Nov 22 01:34:18 np0005531887 kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 22 01:34:18 np0005531887 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 22 01:34:18 np0005531887 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:18 np0005531887 kernel: BIOS-provided physical RAM map:
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 22 01:34:18 np0005531887 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 22 01:34:18 np0005531887 kernel: NX (Execute Disable) protection: active
Nov 22 01:34:18 np0005531887 kernel: APIC: Static calls initialized
Nov 22 01:34:18 np0005531887 kernel: SMBIOS 2.8 present.
Nov 22 01:34:18 np0005531887 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 22 01:34:18 np0005531887 kernel: Hypervisor detected: KVM
Nov 22 01:34:18 np0005531887 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 22 01:34:18 np0005531887 kernel: kvm-clock: using sched offset of 11441433059 cycles
Nov 22 01:34:18 np0005531887 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 22 01:34:18 np0005531887 kernel: tsc: Detected 2799.998 MHz processor
Nov 22 01:34:18 np0005531887 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 22 01:34:18 np0005531887 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 22 01:34:18 np0005531887 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 22 01:34:18 np0005531887 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 22 01:34:18 np0005531887 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 22 01:34:18 np0005531887 kernel: Using GB pages for direct mapping
Nov 22 01:34:18 np0005531887 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 22 01:34:18 np0005531887 kernel: ACPI: Early table checksum verification disabled
Nov 22 01:34:18 np0005531887 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 22 01:34:18 np0005531887 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:18 np0005531887 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:18 np0005531887 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:18 np0005531887 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 22 01:34:18 np0005531887 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:18 np0005531887 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 22 01:34:18 np0005531887 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 22 01:34:18 np0005531887 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 22 01:34:18 np0005531887 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 22 01:34:18 np0005531887 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 22 01:34:18 np0005531887 kernel: No NUMA configuration found
Nov 22 01:34:18 np0005531887 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 22 01:34:18 np0005531887 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 22 01:34:18 np0005531887 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 22 01:34:18 np0005531887 kernel: Zone ranges:
Nov 22 01:34:18 np0005531887 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 22 01:34:18 np0005531887 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 22 01:34:18 np0005531887 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 01:34:18 np0005531887 kernel:  Device   empty
Nov 22 01:34:18 np0005531887 kernel: Movable zone start for each node
Nov 22 01:34:18 np0005531887 kernel: Early memory node ranges
Nov 22 01:34:18 np0005531887 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 22 01:34:18 np0005531887 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 22 01:34:18 np0005531887 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 01:34:18 np0005531887 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 22 01:34:18 np0005531887 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 22 01:34:18 np0005531887 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 22 01:34:18 np0005531887 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 22 01:34:18 np0005531887 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 22 01:34:18 np0005531887 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 22 01:34:18 np0005531887 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 22 01:34:18 np0005531887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 22 01:34:18 np0005531887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 22 01:34:18 np0005531887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 22 01:34:18 np0005531887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 22 01:34:18 np0005531887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 22 01:34:18 np0005531887 kernel: TSC deadline timer available
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Max. logical packages:   8
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Max. logical dies:       8
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Max. dies per package:   1
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Max. threads per core:   1
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Num. cores per package:     1
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Num. threads per package:   1
Nov 22 01:34:18 np0005531887 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 22 01:34:18 np0005531887 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 22 01:34:18 np0005531887 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 22 01:34:18 np0005531887 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 22 01:34:18 np0005531887 kernel: Booting paravirtualized kernel on KVM
Nov 22 01:34:18 np0005531887 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 22 01:34:18 np0005531887 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 22 01:34:18 np0005531887 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 22 01:34:18 np0005531887 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 22 01:34:18 np0005531887 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:18 np0005531887 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 22 01:34:18 np0005531887 kernel: random: crng init done
Nov 22 01:34:18 np0005531887 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: Fallback order for Node 0: 0 
Nov 22 01:34:18 np0005531887 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 22 01:34:18 np0005531887 kernel: Policy zone: Normal
Nov 22 01:34:18 np0005531887 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 22 01:34:18 np0005531887 kernel: software IO TLB: area num 8.
Nov 22 01:34:18 np0005531887 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 22 01:34:18 np0005531887 kernel: ftrace: allocating 49298 entries in 193 pages
Nov 22 01:34:18 np0005531887 kernel: ftrace: allocated 193 pages with 3 groups
Nov 22 01:34:18 np0005531887 kernel: Dynamic Preempt: voluntary
Nov 22 01:34:18 np0005531887 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 22 01:34:18 np0005531887 kernel: rcu: 	RCU event tracing is enabled.
Nov 22 01:34:18 np0005531887 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 22 01:34:18 np0005531887 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 22 01:34:18 np0005531887 kernel: 	Rude variant of Tasks RCU enabled.
Nov 22 01:34:18 np0005531887 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 22 01:34:18 np0005531887 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 22 01:34:18 np0005531887 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 22 01:34:18 np0005531887 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:18 np0005531887 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:18 np0005531887 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 01:34:18 np0005531887 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 22 01:34:18 np0005531887 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 22 01:34:18 np0005531887 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 22 01:34:18 np0005531887 kernel: Console: colour VGA+ 80x25
Nov 22 01:34:18 np0005531887 kernel: printk: console [ttyS0] enabled
Nov 22 01:34:18 np0005531887 kernel: ACPI: Core revision 20230331
Nov 22 01:34:18 np0005531887 kernel: APIC: Switch to symmetric I/O mode setup
Nov 22 01:34:18 np0005531887 kernel: x2apic enabled
Nov 22 01:34:18 np0005531887 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 22 01:34:18 np0005531887 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 22 01:34:18 np0005531887 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 22 01:34:18 np0005531887 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 22 01:34:18 np0005531887 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 22 01:34:18 np0005531887 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 22 01:34:18 np0005531887 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 22 01:34:18 np0005531887 kernel: Spectre V2 : Mitigation: Retpolines
Nov 22 01:34:18 np0005531887 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 22 01:34:18 np0005531887 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 22 01:34:18 np0005531887 kernel: RETBleed: Mitigation: untrained return thunk
Nov 22 01:34:18 np0005531887 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 22 01:34:18 np0005531887 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 22 01:34:18 np0005531887 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 22 01:34:18 np0005531887 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 22 01:34:18 np0005531887 kernel: x86/bugs: return thunk changed
Nov 22 01:34:18 np0005531887 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 22 01:34:18 np0005531887 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 22 01:34:18 np0005531887 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 22 01:34:18 np0005531887 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 22 01:34:18 np0005531887 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 22 01:34:18 np0005531887 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 22 01:34:18 np0005531887 kernel: Freeing SMP alternatives memory: 40K
Nov 22 01:34:18 np0005531887 kernel: pid_max: default: 32768 minimum: 301
Nov 22 01:34:18 np0005531887 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 22 01:34:18 np0005531887 kernel: landlock: Up and running.
Nov 22 01:34:18 np0005531887 kernel: Yama: becoming mindful.
Nov 22 01:34:18 np0005531887 kernel: SELinux:  Initializing.
Nov 22 01:34:18 np0005531887 kernel: LSM support for eBPF active
Nov 22 01:34:18 np0005531887 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 22 01:34:18 np0005531887 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 22 01:34:18 np0005531887 kernel: ... version:                0
Nov 22 01:34:18 np0005531887 kernel: ... bit width:              48
Nov 22 01:34:18 np0005531887 kernel: ... generic registers:      6
Nov 22 01:34:18 np0005531887 kernel: ... value mask:             0000ffffffffffff
Nov 22 01:34:18 np0005531887 kernel: ... max period:             00007fffffffffff
Nov 22 01:34:18 np0005531887 kernel: ... fixed-purpose events:   0
Nov 22 01:34:18 np0005531887 kernel: ... event mask:             000000000000003f
Nov 22 01:34:18 np0005531887 kernel: signal: max sigframe size: 1776
Nov 22 01:34:18 np0005531887 kernel: rcu: Hierarchical SRCU implementation.
Nov 22 01:34:18 np0005531887 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 22 01:34:18 np0005531887 kernel: smp: Bringing up secondary CPUs ...
Nov 22 01:34:18 np0005531887 kernel: smpboot: x86: Booting SMP configuration:
Nov 22 01:34:18 np0005531887 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 22 01:34:18 np0005531887 kernel: smp: Brought up 1 node, 8 CPUs
Nov 22 01:34:18 np0005531887 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 22 01:34:18 np0005531887 kernel: node 0 deferred pages initialised in 10ms
Nov 22 01:34:18 np0005531887 kernel: Memory: 7765908K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616272K reserved, 0K cma-reserved)
Nov 22 01:34:18 np0005531887 kernel: devtmpfs: initialized
Nov 22 01:34:18 np0005531887 kernel: x86/mm: Memory block size: 128MB
Nov 22 01:34:18 np0005531887 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 22 01:34:18 np0005531887 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: pinctrl core: initialized pinctrl subsystem
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 22 01:34:18 np0005531887 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 22 01:34:18 np0005531887 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 22 01:34:18 np0005531887 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 22 01:34:18 np0005531887 kernel: audit: initializing netlink subsys (disabled)
Nov 22 01:34:18 np0005531887 kernel: audit: type=2000 audit(1763793256.890:1): state=initialized audit_enabled=0 res=1
Nov 22 01:34:18 np0005531887 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 22 01:34:18 np0005531887 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 22 01:34:18 np0005531887 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 22 01:34:18 np0005531887 kernel: cpuidle: using governor menu
Nov 22 01:34:18 np0005531887 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 22 01:34:18 np0005531887 kernel: PCI: Using configuration type 1 for base access
Nov 22 01:34:18 np0005531887 kernel: PCI: Using configuration type 1 for extended access
Nov 22 01:34:18 np0005531887 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 22 01:34:18 np0005531887 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 22 01:34:18 np0005531887 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 22 01:34:18 np0005531887 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 22 01:34:18 np0005531887 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 22 01:34:18 np0005531887 kernel: Demotion targets for Node 0: null
Nov 22 01:34:18 np0005531887 kernel: cryptd: max_cpu_qlen set to 1000
Nov 22 01:34:18 np0005531887 kernel: ACPI: Added _OSI(Module Device)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Added _OSI(Processor Device)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 22 01:34:18 np0005531887 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 22 01:34:18 np0005531887 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 22 01:34:18 np0005531887 kernel: ACPI: Interpreter enabled
Nov 22 01:34:18 np0005531887 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 22 01:34:18 np0005531887 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 22 01:34:18 np0005531887 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 22 01:34:18 np0005531887 kernel: PCI: Using E820 reservations for host bridge windows
Nov 22 01:34:18 np0005531887 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 22 01:34:18 np0005531887 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [3] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [4] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [5] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [6] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [7] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [8] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [9] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [10] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [11] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [12] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [13] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [14] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [15] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [16] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [17] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [18] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [19] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [20] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [21] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [22] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [23] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [24] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [25] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [26] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [27] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [28] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [29] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [30] registered
Nov 22 01:34:18 np0005531887 kernel: acpiphp: Slot [31] registered
Nov 22 01:34:18 np0005531887 kernel: PCI host bridge to bus 0000:00
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 22 01:34:18 np0005531887 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 22 01:34:18 np0005531887 kernel: iommu: Default domain type: Translated
Nov 22 01:34:18 np0005531887 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 22 01:34:18 np0005531887 kernel: SCSI subsystem initialized
Nov 22 01:34:18 np0005531887 kernel: ACPI: bus type USB registered
Nov 22 01:34:18 np0005531887 kernel: usbcore: registered new interface driver usbfs
Nov 22 01:34:18 np0005531887 kernel: usbcore: registered new interface driver hub
Nov 22 01:34:18 np0005531887 kernel: usbcore: registered new device driver usb
Nov 22 01:34:18 np0005531887 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 22 01:34:18 np0005531887 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 22 01:34:18 np0005531887 kernel: PTP clock support registered
Nov 22 01:34:18 np0005531887 kernel: EDAC MC: Ver: 3.0.0
Nov 22 01:34:18 np0005531887 kernel: NetLabel: Initializing
Nov 22 01:34:18 np0005531887 kernel: NetLabel:  domain hash size = 128
Nov 22 01:34:18 np0005531887 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 22 01:34:18 np0005531887 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 22 01:34:18 np0005531887 kernel: PCI: Using ACPI for IRQ routing
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 22 01:34:18 np0005531887 kernel: vgaarb: loaded
Nov 22 01:34:18 np0005531887 kernel: clocksource: Switched to clocksource kvm-clock
Nov 22 01:34:18 np0005531887 kernel: VFS: Disk quotas dquot_6.6.0
Nov 22 01:34:18 np0005531887 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 22 01:34:18 np0005531887 kernel: pnp: PnP ACPI init
Nov 22 01:34:18 np0005531887 kernel: pnp: PnP ACPI: found 5 devices
Nov 22 01:34:18 np0005531887 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_INET protocol family
Nov 22 01:34:18 np0005531887 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 22 01:34:18 np0005531887 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_XDP protocol family
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 22 01:34:18 np0005531887 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 22 01:34:18 np0005531887 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 22 01:34:18 np0005531887 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 91224 usecs
Nov 22 01:34:18 np0005531887 kernel: PCI: CLS 0 bytes, default 64
Nov 22 01:34:18 np0005531887 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 22 01:34:18 np0005531887 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 22 01:34:18 np0005531887 kernel: ACPI: bus type thunderbolt registered
Nov 22 01:34:18 np0005531887 kernel: Trying to unpack rootfs image as initramfs...
Nov 22 01:34:18 np0005531887 kernel: Initialise system trusted keyrings
Nov 22 01:34:18 np0005531887 kernel: Key type blacklist registered
Nov 22 01:34:18 np0005531887 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 22 01:34:18 np0005531887 kernel: zbud: loaded
Nov 22 01:34:18 np0005531887 kernel: integrity: Platform Keyring initialized
Nov 22 01:34:18 np0005531887 kernel: integrity: Machine keyring initialized
Nov 22 01:34:18 np0005531887 kernel: Freeing initrd memory: 85868K
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_ALG protocol family
Nov 22 01:34:18 np0005531887 kernel: xor: automatically using best checksumming function   avx       
Nov 22 01:34:18 np0005531887 kernel: Key type asymmetric registered
Nov 22 01:34:18 np0005531887 kernel: Asymmetric key parser 'x509' registered
Nov 22 01:34:18 np0005531887 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 22 01:34:18 np0005531887 kernel: io scheduler mq-deadline registered
Nov 22 01:34:18 np0005531887 kernel: io scheduler kyber registered
Nov 22 01:34:18 np0005531887 kernel: io scheduler bfq registered
Nov 22 01:34:18 np0005531887 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 22 01:34:18 np0005531887 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 22 01:34:18 np0005531887 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 22 01:34:18 np0005531887 kernel: ACPI: button: Power Button [PWRF]
Nov 22 01:34:18 np0005531887 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 22 01:34:18 np0005531887 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 22 01:34:18 np0005531887 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 22 01:34:18 np0005531887 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 22 01:34:18 np0005531887 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 22 01:34:18 np0005531887 kernel: Non-volatile memory driver v1.3
Nov 22 01:34:18 np0005531887 kernel: rdac: device handler registered
Nov 22 01:34:18 np0005531887 kernel: hp_sw: device handler registered
Nov 22 01:34:18 np0005531887 kernel: emc: device handler registered
Nov 22 01:34:18 np0005531887 kernel: alua: device handler registered
Nov 22 01:34:18 np0005531887 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 22 01:34:18 np0005531887 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 22 01:34:18 np0005531887 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 22 01:34:18 np0005531887 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 22 01:34:18 np0005531887 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 22 01:34:18 np0005531887 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 22 01:34:18 np0005531887 kernel: usb usb1: Product: UHCI Host Controller
Nov 22 01:34:18 np0005531887 kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 22 01:34:18 np0005531887 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 22 01:34:18 np0005531887 kernel: hub 1-0:1.0: USB hub found
Nov 22 01:34:18 np0005531887 kernel: hub 1-0:1.0: 2 ports detected
Nov 22 01:34:18 np0005531887 kernel: usbcore: registered new interface driver usbserial_generic
Nov 22 01:34:18 np0005531887 kernel: usbserial: USB Serial support registered for generic
Nov 22 01:34:18 np0005531887 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 22 01:34:18 np0005531887 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 22 01:34:18 np0005531887 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 22 01:34:18 np0005531887 kernel: mousedev: PS/2 mouse device common for all mice
Nov 22 01:34:18 np0005531887 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 22 01:34:18 np0005531887 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 22 01:34:18 np0005531887 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 22 01:34:18 np0005531887 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 22 01:34:18 np0005531887 kernel: rtc_cmos 00:04: registered as rtc0
Nov 22 01:34:18 np0005531887 kernel: rtc_cmos 00:04: setting system clock to 2025-11-22T06:34:17 UTC (1763793257)
Nov 22 01:34:18 np0005531887 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 22 01:34:18 np0005531887 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 22 01:34:18 np0005531887 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 22 01:34:18 np0005531887 kernel: usbcore: registered new interface driver usbhid
Nov 22 01:34:18 np0005531887 kernel: usbhid: USB HID core driver
Nov 22 01:34:18 np0005531887 kernel: drop_monitor: Initializing network drop monitor service
Nov 22 01:34:18 np0005531887 kernel: Initializing XFRM netlink socket
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_INET6 protocol family
Nov 22 01:34:18 np0005531887 kernel: Segment Routing with IPv6
Nov 22 01:34:18 np0005531887 kernel: NET: Registered PF_PACKET protocol family
Nov 22 01:34:18 np0005531887 kernel: mpls_gso: MPLS GSO support
Nov 22 01:34:18 np0005531887 kernel: IPI shorthand broadcast: enabled
Nov 22 01:34:18 np0005531887 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 22 01:34:18 np0005531887 kernel: AES CTR mode by8 optimization enabled
Nov 22 01:34:18 np0005531887 kernel: sched_clock: Marking stable (1222002998, 156116502)->(1490534404, -112414904)
Nov 22 01:34:18 np0005531887 kernel: registered taskstats version 1
Nov 22 01:34:18 np0005531887 kernel: Loading compiled-in X.509 certificates
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 22 01:34:18 np0005531887 kernel: Demotion targets for Node 0: null
Nov 22 01:34:18 np0005531887 kernel: page_owner is disabled
Nov 22 01:34:18 np0005531887 kernel: Key type .fscrypt registered
Nov 22 01:34:18 np0005531887 kernel: Key type fscrypt-provisioning registered
Nov 22 01:34:18 np0005531887 kernel: Key type big_key registered
Nov 22 01:34:18 np0005531887 kernel: Key type encrypted registered
Nov 22 01:34:18 np0005531887 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 22 01:34:18 np0005531887 kernel: Loading compiled-in module X.509 certificates
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 01:34:18 np0005531887 kernel: ima: Allocated hash algorithm: sha256
Nov 22 01:34:18 np0005531887 kernel: ima: No architecture policies found
Nov 22 01:34:18 np0005531887 kernel: evm: Initialising EVM extended attributes:
Nov 22 01:34:18 np0005531887 kernel: evm: security.selinux
Nov 22 01:34:18 np0005531887 kernel: evm: security.SMACK64 (disabled)
Nov 22 01:34:18 np0005531887 kernel: evm: security.SMACK64EXEC (disabled)
Nov 22 01:34:18 np0005531887 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 22 01:34:18 np0005531887 kernel: evm: security.SMACK64MMAP (disabled)
Nov 22 01:34:18 np0005531887 kernel: evm: security.apparmor (disabled)
Nov 22 01:34:18 np0005531887 kernel: evm: security.ima
Nov 22 01:34:18 np0005531887 kernel: evm: security.capability
Nov 22 01:34:18 np0005531887 kernel: evm: HMAC attrs: 0x1
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 22 01:34:18 np0005531887 kernel: Running certificate verification RSA selftest
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 22 01:34:18 np0005531887 kernel: Running certificate verification ECDSA selftest
Nov 22 01:34:18 np0005531887 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 22 01:34:18 np0005531887 kernel: clk: Disabling unused clocks
Nov 22 01:34:18 np0005531887 kernel: Freeing unused decrypted memory: 2028K
Nov 22 01:34:18 np0005531887 kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 22 01:34:18 np0005531887 kernel: Write protecting the kernel read-only data: 30720k
Nov 22 01:34:18 np0005531887 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 22 01:34:18 np0005531887 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 22 01:34:18 np0005531887 kernel: Run /init as init process
Nov 22 01:34:18 np0005531887 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 01:34:18 np0005531887 systemd: Detected virtualization kvm.
Nov 22 01:34:18 np0005531887 systemd: Detected architecture x86-64.
Nov 22 01:34:18 np0005531887 systemd: Running in initrd.
Nov 22 01:34:18 np0005531887 systemd: No hostname configured, using default hostname.
Nov 22 01:34:18 np0005531887 systemd: Hostname set to <localhost>.
Nov 22 01:34:18 np0005531887 systemd: Initializing machine ID from VM UUID.
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: Manufacturer: QEMU
Nov 22 01:34:18 np0005531887 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 22 01:34:18 np0005531887 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 22 01:34:18 np0005531887 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 22 01:34:18 np0005531887 systemd: Queued start job for default target Initrd Default Target.
Nov 22 01:34:18 np0005531887 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:18 np0005531887 systemd: Reached target Local Encrypted Volumes.
Nov 22 01:34:18 np0005531887 systemd: Reached target Initrd /usr File System.
Nov 22 01:34:18 np0005531887 systemd: Reached target Local File Systems.
Nov 22 01:34:18 np0005531887 systemd: Reached target Path Units.
Nov 22 01:34:18 np0005531887 systemd: Reached target Slice Units.
Nov 22 01:34:18 np0005531887 systemd: Reached target Swaps.
Nov 22 01:34:18 np0005531887 systemd: Reached target Timer Units.
Nov 22 01:34:18 np0005531887 systemd: Listening on D-Bus System Message Bus Socket.
Nov 22 01:34:18 np0005531887 systemd: Listening on Journal Socket (/dev/log).
Nov 22 01:34:18 np0005531887 systemd: Listening on Journal Socket.
Nov 22 01:34:18 np0005531887 systemd: Listening on udev Control Socket.
Nov 22 01:34:18 np0005531887 systemd: Listening on udev Kernel Socket.
Nov 22 01:34:18 np0005531887 systemd: Reached target Socket Units.
Nov 22 01:34:18 np0005531887 systemd: Starting Create List of Static Device Nodes...
Nov 22 01:34:18 np0005531887 systemd: Starting Journal Service...
Nov 22 01:34:18 np0005531887 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 01:34:18 np0005531887 systemd: Starting Apply Kernel Variables...
Nov 22 01:34:18 np0005531887 systemd: Starting Create System Users...
Nov 22 01:34:18 np0005531887 systemd: Starting Setup Virtual Console...
Nov 22 01:34:18 np0005531887 systemd: Finished Create List of Static Device Nodes.
Nov 22 01:34:18 np0005531887 systemd: Finished Apply Kernel Variables.
Nov 22 01:34:18 np0005531887 systemd: Finished Create System Users.
Nov 22 01:34:18 np0005531887 systemd-journald[306]: Journal started
Nov 22 01:34:18 np0005531887 systemd-journald[306]: Runtime Journal (/run/log/journal/e5ccb90d580e48d9a7d0f6edef583e11) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:18 np0005531887 systemd-sysusers[310]: Creating group 'users' with GID 100.
Nov 22 01:34:18 np0005531887 systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Nov 22 01:34:18 np0005531887 systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 22 01:34:18 np0005531887 systemd: Started Journal Service.
Nov 22 01:34:18 np0005531887 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 01:34:18 np0005531887 systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 01:34:18 np0005531887 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 01:34:18 np0005531887 systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 01:34:18 np0005531887 systemd[1]: Finished Setup Virtual Console.
Nov 22 01:34:18 np0005531887 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 22 01:34:18 np0005531887 systemd[1]: Starting dracut cmdline hook...
Nov 22 01:34:18 np0005531887 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 22 01:34:18 np0005531887 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 01:34:18 np0005531887 systemd[1]: Finished dracut cmdline hook.
Nov 22 01:34:18 np0005531887 systemd[1]: Starting dracut pre-udev hook...
Nov 22 01:34:18 np0005531887 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 22 01:34:18 np0005531887 kernel: device-mapper: uevent: version 1.0.3
Nov 22 01:34:18 np0005531887 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 22 01:34:18 np0005531887 kernel: RPC: Registered named UNIX socket transport module.
Nov 22 01:34:18 np0005531887 kernel: RPC: Registered udp transport module.
Nov 22 01:34:18 np0005531887 kernel: RPC: Registered tcp transport module.
Nov 22 01:34:18 np0005531887 kernel: RPC: Registered tcp-with-tls transport module.
Nov 22 01:34:18 np0005531887 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 22 01:34:18 np0005531887 rpc.statd[441]: Version 2.5.4 starting
Nov 22 01:34:18 np0005531887 rpc.statd[441]: Initializing NSM state
Nov 22 01:34:18 np0005531887 rpc.idmapd[446]: Setting log level to 0
Nov 22 01:34:18 np0005531887 systemd[1]: Finished dracut pre-udev hook.
Nov 22 01:34:18 np0005531887 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 01:34:18 np0005531887 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 01:34:18 np0005531887 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 01:34:18 np0005531887 systemd[1]: Starting dracut pre-trigger hook...
Nov 22 01:34:18 np0005531887 systemd[1]: Finished dracut pre-trigger hook.
Nov 22 01:34:19 np0005531887 systemd[1]: Starting Coldplug All udev Devices...
Nov 22 01:34:19 np0005531887 systemd[1]: Created slice Slice /system/modprobe.
Nov 22 01:34:19 np0005531887 systemd[1]: Starting Load Kernel Module configfs...
Nov 22 01:34:19 np0005531887 systemd[1]: Finished Coldplug All udev Devices.
Nov 22 01:34:19 np0005531887 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:19 np0005531887 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:19 np0005531887 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Network.
Nov 22 01:34:19 np0005531887 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 01:34:19 np0005531887 systemd[1]: Starting dracut initqueue hook...
Nov 22 01:34:19 np0005531887 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 22 01:34:19 np0005531887 systemd[1]: Mounting Kernel Configuration File System...
Nov 22 01:34:19 np0005531887 systemd[1]: Mounted Kernel Configuration File System.
Nov 22 01:34:19 np0005531887 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target System Initialization.
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Basic System.
Nov 22 01:34:19 np0005531887 kernel: vda: vda1
Nov 22 01:34:19 np0005531887 kernel: scsi host0: ata_piix
Nov 22 01:34:19 np0005531887 kernel: scsi host1: ata_piix
Nov 22 01:34:19 np0005531887 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 22 01:34:19 np0005531887 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 22 01:34:19 np0005531887 systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Initrd Root Device.
Nov 22 01:34:19 np0005531887 kernel: ata1: found unknown device (class 0)
Nov 22 01:34:19 np0005531887 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 22 01:34:19 np0005531887 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 22 01:34:19 np0005531887 systemd-udevd[496]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:34:19 np0005531887 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 22 01:34:19 np0005531887 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 22 01:34:19 np0005531887 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 22 01:34:19 np0005531887 systemd[1]: Finished dracut initqueue hook.
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 22 01:34:19 np0005531887 systemd[1]: Reached target Remote File Systems.
Nov 22 01:34:19 np0005531887 systemd[1]: Starting dracut pre-mount hook...
Nov 22 01:34:19 np0005531887 systemd[1]: Finished dracut pre-mount hook.
Nov 22 01:34:19 np0005531887 systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 22 01:34:19 np0005531887 systemd-fsck[550]: /usr/sbin/fsck.xfs: XFS file system.
Nov 22 01:34:19 np0005531887 systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 01:34:19 np0005531887 systemd[1]: Mounting /sysroot...
Nov 22 01:34:20 np0005531887 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 22 01:34:20 np0005531887 kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 22 01:34:20 np0005531887 kernel: XFS (vda1): Ending clean mount
Nov 22 01:34:20 np0005531887 systemd[1]: Mounted /sysroot.
Nov 22 01:34:20 np0005531887 systemd[1]: Reached target Initrd Root File System.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 22 01:34:20 np0005531887 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 22 01:34:20 np0005531887 systemd[1]: Reached target Initrd File Systems.
Nov 22 01:34:20 np0005531887 systemd[1]: Reached target Initrd Default Target.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting dracut mount hook...
Nov 22 01:34:20 np0005531887 systemd[1]: Finished dracut mount hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 22 01:34:20 np0005531887 rpc.idmapd[446]: exiting on signal 15
Nov 22 01:34:20 np0005531887 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Network.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Timer Units.
Nov 22 01:34:20 np0005531887 systemd[1]: dbus.socket: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Initrd Default Target.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Basic System.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Initrd Root Device.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Initrd /usr File System.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Path Units.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Remote File Systems.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Slice Units.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Socket Units.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target System Initialization.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Local File Systems.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Swaps.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut mount hook.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut pre-mount hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut initqueue hook.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Coldplug All udev Devices.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut pre-trigger hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Setup Virtual Console.
Nov 22 01:34:20 np0005531887 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Closed udev Control Socket.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Closed udev Kernel Socket.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut pre-udev hook.
Nov 22 01:34:20 np0005531887 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped dracut cmdline hook.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting Cleanup udev Database...
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 22 01:34:20 np0005531887 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 22 01:34:20 np0005531887 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Stopped Create System Users.
Nov 22 01:34:20 np0005531887 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 22 01:34:20 np0005531887 systemd[1]: Finished Cleanup udev Database.
Nov 22 01:34:20 np0005531887 systemd[1]: Reached target Switch Root.
Nov 22 01:34:20 np0005531887 systemd[1]: Starting Switch Root...
Nov 22 01:34:20 np0005531887 systemd[1]: Switching root.
Nov 22 01:34:20 np0005531887 systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Nov 22 01:34:20 np0005531887 systemd-journald[306]: Journal stopped
Nov 22 01:34:23 np0005531887 kernel: audit: type=1404 audit(1763793261.340:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:34:23 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:34:23 np0005531887 kernel: audit: type=1403 audit(1763793261.524:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 22 01:34:23 np0005531887 systemd: Successfully loaded SELinux policy in 189.551ms.
Nov 22 01:34:23 np0005531887 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.375ms.
Nov 22 01:34:23 np0005531887 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 01:34:23 np0005531887 systemd: Detected virtualization kvm.
Nov 22 01:34:23 np0005531887 systemd: Detected architecture x86-64.
Nov 22 01:34:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:34:23 np0005531887 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd: Stopped Switch Root.
Nov 22 01:34:23 np0005531887 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 22 01:34:23 np0005531887 systemd: Created slice Slice /system/getty.
Nov 22 01:34:23 np0005531887 systemd: Created slice Slice /system/serial-getty.
Nov 22 01:34:23 np0005531887 systemd: Created slice Slice /system/sshd-keygen.
Nov 22 01:34:23 np0005531887 systemd: Created slice User and Session Slice.
Nov 22 01:34:23 np0005531887 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 01:34:23 np0005531887 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 22 01:34:23 np0005531887 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 22 01:34:23 np0005531887 systemd: Reached target Local Encrypted Volumes.
Nov 22 01:34:23 np0005531887 systemd: Stopped target Switch Root.
Nov 22 01:34:23 np0005531887 systemd: Stopped target Initrd File Systems.
Nov 22 01:34:23 np0005531887 systemd: Stopped target Initrd Root File System.
Nov 22 01:34:23 np0005531887 systemd: Reached target Local Integrity Protected Volumes.
Nov 22 01:34:23 np0005531887 systemd: Reached target Path Units.
Nov 22 01:34:23 np0005531887 systemd: Reached target rpc_pipefs.target.
Nov 22 01:34:23 np0005531887 systemd: Reached target Slice Units.
Nov 22 01:34:23 np0005531887 systemd: Reached target Swaps.
Nov 22 01:34:23 np0005531887 systemd: Reached target Local Verity Protected Volumes.
Nov 22 01:34:23 np0005531887 systemd: Listening on RPCbind Server Activation Socket.
Nov 22 01:34:23 np0005531887 systemd: Reached target RPC Port Mapper.
Nov 22 01:34:23 np0005531887 systemd: Listening on Process Core Dump Socket.
Nov 22 01:34:23 np0005531887 systemd: Listening on initctl Compatibility Named Pipe.
Nov 22 01:34:23 np0005531887 systemd: Listening on udev Control Socket.
Nov 22 01:34:23 np0005531887 systemd: Listening on udev Kernel Socket.
Nov 22 01:34:23 np0005531887 systemd: Mounting Huge Pages File System...
Nov 22 01:34:23 np0005531887 systemd: Mounting POSIX Message Queue File System...
Nov 22 01:34:23 np0005531887 systemd: Mounting Kernel Debug File System...
Nov 22 01:34:23 np0005531887 systemd: Mounting Kernel Trace File System...
Nov 22 01:34:23 np0005531887 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 01:34:23 np0005531887 systemd: Starting Create List of Static Device Nodes...
Nov 22 01:34:23 np0005531887 systemd: Starting Load Kernel Module configfs...
Nov 22 01:34:23 np0005531887 systemd: Starting Load Kernel Module drm...
Nov 22 01:34:23 np0005531887 systemd: Starting Load Kernel Module efi_pstore...
Nov 22 01:34:23 np0005531887 systemd: Starting Load Kernel Module fuse...
Nov 22 01:34:23 np0005531887 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 22 01:34:23 np0005531887 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd: Stopped File System Check on Root Device.
Nov 22 01:34:23 np0005531887 systemd: Stopped Journal Service.
Nov 22 01:34:23 np0005531887 systemd: Starting Journal Service...
Nov 22 01:34:23 np0005531887 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 01:34:23 np0005531887 systemd: Starting Generate network units from Kernel command line...
Nov 22 01:34:23 np0005531887 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:23 np0005531887 systemd: Starting Remount Root and Kernel File Systems...
Nov 22 01:34:23 np0005531887 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 22 01:34:23 np0005531887 systemd: Starting Apply Kernel Variables...
Nov 22 01:34:23 np0005531887 systemd: Starting Coldplug All udev Devices...
Nov 22 01:34:23 np0005531887 systemd-journald[676]: Journal started
Nov 22 01:34:23 np0005531887 systemd-journald[676]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:23 np0005531887 systemd[1]: Queued start job for default target Multi-User System.
Nov 22 01:34:23 np0005531887 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd: Started Journal Service.
Nov 22 01:34:23 np0005531887 systemd[1]: Mounted Huge Pages File System.
Nov 22 01:34:23 np0005531887 systemd[1]: Mounted POSIX Message Queue File System.
Nov 22 01:34:23 np0005531887 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 22 01:34:23 np0005531887 systemd[1]: Mounted Kernel Debug File System.
Nov 22 01:34:23 np0005531887 systemd[1]: Mounted Kernel Trace File System.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 01:34:23 np0005531887 kernel: fuse: init (API version 7.37)
Nov 22 01:34:23 np0005531887 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:23 np0005531887 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 22 01:34:23 np0005531887 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Load Kernel Module fuse.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Generate network units from Kernel command line.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Apply Kernel Variables.
Nov 22 01:34:23 np0005531887 kernel: ACPI: bus type drm_connector registered
Nov 22 01:34:23 np0005531887 systemd[1]: Mounting FUSE Control File System...
Nov 22 01:34:23 np0005531887 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 01:34:23 np0005531887 systemd[1]: Starting Rebuild Hardware Database...
Nov 22 01:34:23 np0005531887 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 22 01:34:23 np0005531887 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 22 01:34:23 np0005531887 systemd[1]: Starting Load/Save OS Random Seed...
Nov 22 01:34:23 np0005531887 systemd[1]: Starting Create System Users...
Nov 22 01:34:23 np0005531887 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Load Kernel Module drm.
Nov 22 01:34:23 np0005531887 systemd[1]: Mounted FUSE Control File System.
Nov 22 01:34:23 np0005531887 systemd-journald[676]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 01:34:23 np0005531887 systemd-journald[676]: Received client request to flush runtime journal.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Coldplug All udev Devices.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Create System Users.
Nov 22 01:34:23 np0005531887 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 01:34:23 np0005531887 systemd[1]: Finished Load/Save OS Random Seed.
Nov 22 01:34:23 np0005531887 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 01:34:24 np0005531887 systemd[1]: Reached target Preparation for Local File Systems.
Nov 22 01:34:24 np0005531887 systemd[1]: Reached target Local File Systems.
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 22 01:34:24 np0005531887 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 22 01:34:24 np0005531887 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 22 01:34:24 np0005531887 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Automatic Boot Loader Update...
Nov 22 01:34:24 np0005531887 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 01:34:24 np0005531887 bootctl[695]: Couldn't find EFI system partition, skipping.
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Automatic Boot Loader Update.
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Security Auditing Service...
Nov 22 01:34:24 np0005531887 systemd[1]: Starting RPC Bind...
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Rebuild Journal Catalog...
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Rebuild Journal Catalog.
Nov 22 01:34:24 np0005531887 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 22 01:34:24 np0005531887 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 22 01:34:24 np0005531887 systemd[1]: Started RPC Bind.
Nov 22 01:34:24 np0005531887 augenrules[706]: /sbin/augenrules: No change
Nov 22 01:34:24 np0005531887 augenrules[721]: No rules
Nov 22 01:34:24 np0005531887 augenrules[721]: enabled 1
Nov 22 01:34:24 np0005531887 augenrules[721]: failure 1
Nov 22 01:34:24 np0005531887 augenrules[721]: pid 701
Nov 22 01:34:24 np0005531887 augenrules[721]: rate_limit 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_limit 8192
Nov 22 01:34:24 np0005531887 augenrules[721]: lost 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog 3
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time 60000
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time_actual 0
Nov 22 01:34:24 np0005531887 augenrules[721]: enabled 1
Nov 22 01:34:24 np0005531887 augenrules[721]: failure 1
Nov 22 01:34:24 np0005531887 augenrules[721]: pid 701
Nov 22 01:34:24 np0005531887 augenrules[721]: rate_limit 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_limit 8192
Nov 22 01:34:24 np0005531887 augenrules[721]: lost 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time 60000
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time_actual 0
Nov 22 01:34:24 np0005531887 augenrules[721]: enabled 1
Nov 22 01:34:24 np0005531887 augenrules[721]: failure 1
Nov 22 01:34:24 np0005531887 augenrules[721]: pid 701
Nov 22 01:34:24 np0005531887 augenrules[721]: rate_limit 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_limit 8192
Nov 22 01:34:24 np0005531887 augenrules[721]: lost 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog 0
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time 60000
Nov 22 01:34:24 np0005531887 augenrules[721]: backlog_wait_time_actual 0
Nov 22 01:34:24 np0005531887 systemd[1]: Started Security Auditing Service.
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 22 01:34:24 np0005531887 systemd[1]: Finished Rebuild Hardware Database.
Nov 22 01:34:24 np0005531887 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 01:34:24 np0005531887 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 01:34:25 np0005531887 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 01:34:25 np0005531887 systemd[1]: Starting Load Kernel Module configfs...
Nov 22 01:34:25 np0005531887 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 01:34:25 np0005531887 systemd[1]: Finished Load Kernel Module configfs.
Nov 22 01:34:25 np0005531887 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 22 01:34:25 np0005531887 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 22 01:34:25 np0005531887 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 22 01:34:25 np0005531887 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 22 01:34:25 np0005531887 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 22 01:34:25 np0005531887 systemd-udevd[734]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:34:25 np0005531887 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 22 01:34:25 np0005531887 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 22 01:34:25 np0005531887 kernel: Console: switching to colour dummy device 80x25
Nov 22 01:34:25 np0005531887 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 22 01:34:25 np0005531887 kernel: [drm] features: -context_init
Nov 22 01:34:25 np0005531887 kernel: [drm] number of scanouts: 1
Nov 22 01:34:25 np0005531887 kernel: [drm] number of cap sets: 0
Nov 22 01:34:25 np0005531887 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 22 01:34:25 np0005531887 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 22 01:34:25 np0005531887 kernel: Console: switching to colour frame buffer device 128x48
Nov 22 01:34:25 np0005531887 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 22 01:34:25 np0005531887 kernel: kvm_amd: TSC scaling supported
Nov 22 01:34:25 np0005531887 kernel: kvm_amd: Nested Virtualization enabled
Nov 22 01:34:25 np0005531887 kernel: kvm_amd: Nested Paging enabled
Nov 22 01:34:25 np0005531887 kernel: kvm_amd: LBR virtualization supported
Nov 22 01:34:26 np0005531887 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 22 01:34:26 np0005531887 systemd[1]: Starting Update is Completed...
Nov 22 01:34:26 np0005531887 systemd[1]: Finished Update is Completed.
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target System Initialization.
Nov 22 01:34:26 np0005531887 systemd[1]: Started dnf makecache --timer.
Nov 22 01:34:26 np0005531887 systemd[1]: Started Daily rotation of log files.
Nov 22 01:34:26 np0005531887 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target Timer Units.
Nov 22 01:34:26 np0005531887 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 01:34:26 np0005531887 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target Socket Units.
Nov 22 01:34:26 np0005531887 systemd[1]: Starting D-Bus System Message Bus...
Nov 22 01:34:26 np0005531887 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:26 np0005531887 systemd[1]: Started D-Bus System Message Bus.
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target Basic System.
Nov 22 01:34:26 np0005531887 dbus-broker-lau[810]: Ready
Nov 22 01:34:26 np0005531887 systemd[1]: Starting NTP client/server...
Nov 22 01:34:26 np0005531887 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 22 01:34:26 np0005531887 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 22 01:34:26 np0005531887 systemd[1]: Starting IPv4 firewall with iptables...
Nov 22 01:34:26 np0005531887 systemd[1]: Started irqbalance daemon.
Nov 22 01:34:26 np0005531887 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 22 01:34:26 np0005531887 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:26 np0005531887 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:26 np0005531887 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target sshd-keygen.target.
Nov 22 01:34:26 np0005531887 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 22 01:34:26 np0005531887 systemd[1]: Reached target User and Group Name Lookups.
Nov 22 01:34:26 np0005531887 systemd[1]: Starting User Login Management...
Nov 22 01:34:26 np0005531887 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 22 01:34:26 np0005531887 chronyd[829]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 01:34:26 np0005531887 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 22 01:34:26 np0005531887 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 22 01:34:26 np0005531887 systemd-logind[821]: New seat seat0.
Nov 22 01:34:26 np0005531887 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 01:34:26 np0005531887 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 01:34:26 np0005531887 systemd[1]: Started User Login Management.
Nov 22 01:34:26 np0005531887 chronyd[829]: Loaded 0 symmetric keys
Nov 22 01:34:26 np0005531887 chronyd[829]: Using right/UTC timezone to obtain leap second data
Nov 22 01:34:26 np0005531887 chronyd[829]: Loaded seccomp filter (level 2)
Nov 22 01:34:26 np0005531887 systemd[1]: Started NTP client/server.
Nov 22 01:34:26 np0005531887 iptables.init[815]: iptables: Applying firewall rules: [  OK  ]
Nov 22 01:34:26 np0005531887 systemd[1]: Finished IPv4 firewall with iptables.
Nov 22 01:34:28 np0005531887 cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 22 Nov 2025 06:34:28 +0000. Up 12.14 seconds.
Nov 22 01:34:28 np0005531887 systemd[1]: run-cloud\x2dinit-tmp-tmpahmdotvq.mount: Deactivated successfully.
Nov 22 01:34:29 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 01:34:29 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 01:34:29 np0005531887 systemd-hostnamed[853]: Hostname set to <np0005531887.novalocal> (static)
Nov 22 01:34:29 np0005531887 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 22 01:34:29 np0005531887 systemd[1]: Reached target Preparation for Network.
Nov 22 01:34:29 np0005531887 systemd[1]: Starting Network Manager...
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.5882] NetworkManager (version 1.54.1-1.el9) is starting... (boot:c39ce406-ec93-4c16-a5f8-0230d1610d46)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.5897] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6322] manager[0x56210727c080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6392] hostname: hostname: using hostnamed
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6393] hostname: static hostname changed from (none) to "np0005531887.novalocal"
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6400] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6615] manager[0x56210727c080]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.6616] manager[0x56210727c080]: rfkill: WWAN hardware radio set enabled
Nov 22 01:34:29 np0005531887 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9576] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9577] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9578] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9579] manager: Networking is enabled by state file
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9584] settings: Loaded settings plugin: keyfile (internal)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9651] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9698] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9802] dhcp: init: Using DHCP client 'internal'
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9808] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9833] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:34:29 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9877] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9889] device (lo): Activation: starting connection 'lo' (b8f7fc42-9e8e-4183-b750-e949ad8a8a15)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9908] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9913] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 01:34:29 np0005531887 systemd[1]: Started Network Manager.
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9987] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 01:34:29 np0005531887 NetworkManager[857]: <info>  [1763793269.9998] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0002] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0005] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0009] device (eth0): carrier: link connected
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0013] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 systemd[1]: Reached target Network.
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0022] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0032] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0036] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0052] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0054] manager: NetworkManager state is now CONNECTING
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0055] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:34:30 np0005531887 systemd[1]: Starting Network Manager Wait Online...
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0084] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0087] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:34:30 np0005531887 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 22 01:34:30 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0238] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0240] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 01:34:30 np0005531887 NetworkManager[857]: <info>  [1763793270.0249] device (lo): Activation: successful, device activated.
Nov 22 01:34:30 np0005531887 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 22 01:34:30 np0005531887 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 01:34:30 np0005531887 systemd[1]: Reached target NFS client services.
Nov 22 01:34:30 np0005531887 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 01:34:30 np0005531887 systemd[1]: Reached target Remote File Systems.
Nov 22 01:34:30 np0005531887 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6872] dhcp4 (eth0): state changed new lease, address=38.129.56.226
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6886] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6914] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6959] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6962] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6966] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6970] device (eth0): Activation: successful, device activated.
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6976] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 01:34:33 np0005531887 NetworkManager[857]: <info>  [1763793273.6979] manager: startup complete
Nov 22 01:34:33 np0005531887 systemd[1]: Finished Network Manager Wait Online.
Nov 22 01:34:33 np0005531887 systemd[1]: Starting Cloud-init: Network Stage...
Nov 22 01:34:34 np0005531887 cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 22 Nov 2025 06:34:34 +0000. Up 17.69 seconds.
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |  eth0  | True |        38.129.56.226        | 255.255.255.0 | global | fa:16:3e:cc:03:04 |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fecc:304/64 |       .       |  link  | fa:16:3e:cc:03:04 |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Nov 22 01:34:34 np0005531887 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 25 affinity is now unmanaged
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 31 affinity is now unmanaged
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 28 affinity is now unmanaged
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 32 affinity is now unmanaged
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 30 affinity is now unmanaged
Nov 22 01:34:36 np0005531887 irqbalance[816]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 22 01:34:36 np0005531887 irqbalance[816]: IRQ 29 affinity is now unmanaged
Nov 22 01:34:38 np0005531887 cloud-init[923]: Generating public/private rsa key pair.
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key fingerprint is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: SHA256:acTS5TfAqbN1KSYSbrlqIK0IfuVENeO+svnirxVPUwY root@np0005531887.novalocal
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key's randomart image is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: +---[RSA 3072]----+
Nov 22 01:34:38 np0005531887 cloud-init[923]: |         Eo.     |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |      .* o+.     |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |     .+o*..oo.   |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |     .=++.*.o.   |
Nov 22 01:34:38 np0005531887 cloud-init[923]: | .  ...+SO o     |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |o o  o.o= .      |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |+o .+. ...       |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |o. .o+o.         |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |  ...=Bo         |
Nov 22 01:34:38 np0005531887 cloud-init[923]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531887 cloud-init[923]: Generating public/private ecdsa key pair.
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key fingerprint is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: SHA256:P+iW+IdDpp6u8ih72ga0icj+Ihetksp+JOdDfluLBzk root@np0005531887.novalocal
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key's randomart image is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: +---[ECDSA 256]---+
Nov 22 01:34:38 np0005531887 cloud-init[923]: |                 |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |                 |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |                 |
Nov 22 01:34:38 np0005531887 cloud-init[923]: | .               |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |= o.  . S        |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |+=.+.E  oo       |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |.oBo  o*.oo      |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |*+** .=+* ..     |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |B@*+=**+oo       |
Nov 22 01:34:38 np0005531887 cloud-init[923]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531887 cloud-init[923]: Generating public/private ed25519 key pair.
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 22 01:34:38 np0005531887 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key fingerprint is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: SHA256:k/cB6BFVSGwF0eVvqrlH1HW/rkworuNeMKppVCpNcx8 root@np0005531887.novalocal
Nov 22 01:34:38 np0005531887 cloud-init[923]: The key's randomart image is:
Nov 22 01:34:38 np0005531887 cloud-init[923]: +--[ED25519 256]--+
Nov 22 01:34:38 np0005531887 cloud-init[923]: |        .+=Bo..  |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |         o+ ..  o|
Nov 22 01:34:38 np0005531887 cloud-init[923]: |        o..   ..+|
Nov 22 01:34:38 np0005531887 cloud-init[923]: |   o o E o .  ..o|
Nov 22 01:34:38 np0005531887 cloud-init[923]: |  o = .oS . ..  +|
Nov 22 01:34:38 np0005531887 cloud-init[923]: | . +  ..oo ....+ |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |  o  .   o ..oo  |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |   .o  .o . oo.. |
Nov 22 01:34:38 np0005531887 cloud-init[923]: |  .o  o+o.  +=.  |
Nov 22 01:34:38 np0005531887 cloud-init[923]: +----[SHA256]-----+
Nov 22 01:34:38 np0005531887 systemd[1]: Finished Cloud-init: Network Stage.
Nov 22 01:34:38 np0005531887 systemd[1]: Reached target Cloud-config availability.
Nov 22 01:34:38 np0005531887 systemd[1]: Reached target Network is Online.
Nov 22 01:34:38 np0005531887 systemd[1]: Starting Cloud-init: Config Stage...
Nov 22 01:34:38 np0005531887 systemd[1]: Starting Crash recovery kernel arming...
Nov 22 01:34:38 np0005531887 systemd[1]: Starting Notify NFS peers of a restart...
Nov 22 01:34:38 np0005531887 sm-notify[1006]: Version 2.5.4 starting
Nov 22 01:34:38 np0005531887 systemd[1]: Starting System Logging Service...
Nov 22 01:34:38 np0005531887 systemd[1]: Starting OpenSSH server daemon...
Nov 22 01:34:38 np0005531887 systemd[1]: Starting Permit User Sessions...
Nov 22 01:34:38 np0005531887 systemd[1]: Started Notify NFS peers of a restart.
Nov 22 01:34:38 np0005531887 systemd[1]: Finished Permit User Sessions.
Nov 22 01:34:38 np0005531887 systemd[1]: Started Command Scheduler.
Nov 22 01:34:38 np0005531887 systemd[1]: Started Getty on tty1.
Nov 22 01:34:38 np0005531887 systemd[1]: Started Serial Getty on ttyS0.
Nov 22 01:34:38 np0005531887 systemd[1]: Reached target Login Prompts.
Nov 22 01:34:38 np0005531887 systemd[1]: Started OpenSSH server daemon.
Nov 22 01:34:38 np0005531887 rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Nov 22 01:34:38 np0005531887 rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 22 01:34:38 np0005531887 systemd[1]: Started System Logging Service.
Nov 22 01:34:38 np0005531887 systemd[1]: Reached target Multi-User System.
Nov 22 01:34:38 np0005531887 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 22 01:34:38 np0005531887 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 22 01:34:38 np0005531887 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 22 01:34:38 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 01:34:38 np0005531887 cloud-init[1073]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 22 Nov 2025 06:34:38 +0000. Up 22.21 seconds.
Nov 22 01:34:38 np0005531887 chronyd[829]: Selected source 138.197.135.239 (2.centos.pool.ntp.org)
Nov 22 01:34:39 np0005531887 chronyd[829]: System clock wrong by 1.020112 seconds
Nov 22 01:34:39 np0005531887 chronyd[829]: System clock was stepped by 1.020112 seconds
Nov 22 01:34:39 np0005531887 chronyd[829]: System clock TAI offset set to 37 seconds
Nov 22 01:34:39 np0005531887 systemd[1]: Finished Cloud-init: Config Stage.
Nov 22 01:34:39 np0005531887 systemd[1]: Starting Cloud-init: Final Stage...
Nov 22 01:34:39 np0005531887 kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Nov 22 01:34:39 np0005531887 kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 22 01:34:39 np0005531887 cloud-init[1205]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 22 Nov 2025 06:34:39 +0000. Up 22.61 seconds.
Nov 22 01:34:40 np0005531887 cloud-init[1247]: #############################################################
Nov 22 01:34:40 np0005531887 cloud-init[1250]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 22 01:34:40 np0005531887 cloud-init[1252]: 256 SHA256:P+iW+IdDpp6u8ih72ga0icj+Ihetksp+JOdDfluLBzk root@np0005531887.novalocal (ECDSA)
Nov 22 01:34:40 np0005531887 cloud-init[1256]: 256 SHA256:k/cB6BFVSGwF0eVvqrlH1HW/rkworuNeMKppVCpNcx8 root@np0005531887.novalocal (ED25519)
Nov 22 01:34:40 np0005531887 cloud-init[1262]: 3072 SHA256:acTS5TfAqbN1KSYSbrlqIK0IfuVENeO+svnirxVPUwY root@np0005531887.novalocal (RSA)
Nov 22 01:34:40 np0005531887 cloud-init[1266]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 22 01:34:40 np0005531887 cloud-init[1268]: #############################################################
Nov 22 01:34:40 np0005531887 cloud-init[1205]: Cloud-init v. 24.4-7.el9 finished at Sat, 22 Nov 2025 06:34:40 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 22.81 seconds
Nov 22 01:34:40 np0005531887 systemd[1]: Finished Cloud-init: Final Stage.
Nov 22 01:34:40 np0005531887 systemd[1]: Reached target Cloud-init target.
Nov 22 01:34:40 np0005531887 dracut[1303]: dracut-057-102.git20250818.el9
Nov 22 01:34:40 np0005531887 dracut[1305]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 01:34:41 np0005531887 dracut[1305]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: memstrack is not available
Nov 22 01:34:42 np0005531887 dracut[1305]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 01:34:42 np0005531887 dracut[1305]: memstrack is not available
Nov 22 01:34:42 np0005531887 dracut[1305]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 01:34:42 np0005531887 dracut[1305]: *** Including module: systemd ***
Nov 22 01:34:43 np0005531887 dracut[1305]: *** Including module: fips ***
Nov 22 01:34:43 np0005531887 dracut[1305]: *** Including module: systemd-initrd ***
Nov 22 01:34:43 np0005531887 dracut[1305]: *** Including module: i18n ***
Nov 22 01:34:43 np0005531887 dracut[1305]: *** Including module: drm ***
Nov 22 01:34:44 np0005531887 dracut[1305]: *** Including module: prefixdevname ***
Nov 22 01:34:44 np0005531887 dracut[1305]: *** Including module: kernel-modules ***
Nov 22 01:34:44 np0005531887 kernel: block vda: the capability attribute has been deprecated.
Nov 22 01:34:44 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: kernel-modules-extra ***
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: qemu ***
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: fstab-sys ***
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: rootfs-block ***
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: terminfo ***
Nov 22 01:34:45 np0005531887 dracut[1305]: *** Including module: udev-rules ***
Nov 22 01:34:46 np0005531887 dracut[1305]: Skipping udev rule: 91-permissions.rules
Nov 22 01:34:46 np0005531887 dracut[1305]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: virtiofs ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: dracut-systemd ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: usrmount ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: base ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: fs-lib ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: kdumpbase ***
Nov 22 01:34:46 np0005531887 dracut[1305]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 22 01:34:46 np0005531887 dracut[1305]:  microcode_ctl module: mangling fw_dir
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel" is ignored
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 22 01:34:46 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 22 01:34:47 np0005531887 dracut[1305]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 22 01:34:47 np0005531887 dracut[1305]: *** Including module: openssl ***
Nov 22 01:34:47 np0005531887 dracut[1305]: *** Including module: shutdown ***
Nov 22 01:34:47 np0005531887 dracut[1305]: *** Including module: squash ***
Nov 22 01:34:47 np0005531887 dracut[1305]: *** Including modules done ***
Nov 22 01:34:47 np0005531887 dracut[1305]: *** Installing kernel module dependencies ***
Nov 22 01:34:48 np0005531887 dracut[1305]: *** Installing kernel module dependencies done ***
Nov 22 01:34:48 np0005531887 dracut[1305]: *** Resolving executable dependencies ***
Nov 22 01:34:50 np0005531887 dracut[1305]: *** Resolving executable dependencies done ***
Nov 22 01:34:50 np0005531887 dracut[1305]: *** Generating early-microcode cpio image ***
Nov 22 01:34:50 np0005531887 dracut[1305]: *** Store current command line parameters ***
Nov 22 01:34:50 np0005531887 dracut[1305]: Stored kernel commandline:
Nov 22 01:34:50 np0005531887 dracut[1305]: No dracut internal kernel commandline stored in the initramfs
Nov 22 01:34:50 np0005531887 dracut[1305]: *** Install squash loader ***
Nov 22 01:34:52 np0005531887 dracut[1305]: *** Squashing the files inside the initramfs ***
Nov 22 01:34:53 np0005531887 dracut[1305]: *** Squashing the files inside the initramfs done ***
Nov 22 01:34:53 np0005531887 dracut[1305]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 22 01:34:53 np0005531887 dracut[1305]: *** Hardlinking files ***
Nov 22 01:34:53 np0005531887 dracut[1305]: *** Hardlinking files done ***
Nov 22 01:34:54 np0005531887 dracut[1305]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 22 01:34:54 np0005531887 kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Nov 22 01:34:54 np0005531887 kdumpctl[1015]: kdump: Starting kdump: [OK]
Nov 22 01:34:54 np0005531887 systemd[1]: Finished Crash recovery kernel arming.
Nov 22 01:34:54 np0005531887 systemd[1]: Startup finished in 1.593s (kernel) + 3.408s (initrd) + 32.510s (userspace) = 37.512s.
Nov 22 01:34:55 np0005531887 systemd[1]: Created slice User Slice of UID 1000.
Nov 22 01:34:55 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 22 01:34:55 np0005531887 systemd-logind[821]: New session 1 of user zuul.
Nov 22 01:34:55 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 22 01:34:55 np0005531887 systemd[1]: Starting User Manager for UID 1000...
Nov 22 01:34:56 np0005531887 systemd[4301]: Queued start job for default target Main User Target.
Nov 22 01:34:56 np0005531887 systemd[4301]: Created slice User Application Slice.
Nov 22 01:34:56 np0005531887 systemd[4301]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 01:34:56 np0005531887 systemd[4301]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 01:34:56 np0005531887 systemd[4301]: Reached target Paths.
Nov 22 01:34:56 np0005531887 systemd[4301]: Reached target Timers.
Nov 22 01:34:56 np0005531887 systemd[4301]: Starting D-Bus User Message Bus Socket...
Nov 22 01:34:56 np0005531887 systemd[4301]: Starting Create User's Volatile Files and Directories...
Nov 22 01:34:56 np0005531887 systemd[4301]: Listening on D-Bus User Message Bus Socket.
Nov 22 01:34:56 np0005531887 systemd[4301]: Finished Create User's Volatile Files and Directories.
Nov 22 01:34:56 np0005531887 systemd[4301]: Reached target Sockets.
Nov 22 01:34:56 np0005531887 systemd[4301]: Reached target Basic System.
Nov 22 01:34:56 np0005531887 systemd[4301]: Reached target Main User Target.
Nov 22 01:34:56 np0005531887 systemd[4301]: Startup finished in 172ms.
Nov 22 01:34:56 np0005531887 systemd[1]: Started User Manager for UID 1000.
Nov 22 01:34:56 np0005531887 systemd[1]: Started Session 1 of User zuul.
Nov 22 01:34:56 np0005531887 python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:00 np0005531887 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:35:08 np0005531887 python3[4413]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:16 np0005531887 python3[4471]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:17 np0005531887 python3[4511]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 22 01:35:19 np0005531887 python3[4537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDIDDDD+fltt9cmDgcjLSkGENwZvQzj5XoQ8wGDcg2s6u+LVhotbjXRoCyQvkLrQ9+aYjFbt1JZ05PeSToOVkPdJ2l6AucsYKMFk7tKlgqYA0SfBQkQjrI4dYCIJp5Zl46tl+HQ7eT2kkERLJRgc1sNhw88jbxU83GEmQNcj9/Q6rj2r+/nIptD66sUseZ1GDb43Ao7zBSzRrD8HRZlEfDChNFod0RykV5phE1R5jhZzJ7KtwI8ovnac3+YT5JW3uK2sdRHHMkZyMiqLqGgsozncX0tlbDqQ6Td89rR3ia15IGC2ZhCwZ5c8vyHhHLG0eEjA73ADlY3cxVKkV8ULfKIWbZL7+AmS7WLvTbD3QSMnkFyuzpAbq/zrs1iZFaLNioOyXiKn0sdTX+CE+goDViTSGJIE8ELsdVZ1adwTqArvAG+Rek7RLJ0oiTWo43Kjdyfs/JYcGpxz+5HVoi4aE2g0M5qhLU7D/EmGa4VwYjui4rxXMlhIFmTsq1NgHSMlB8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:20 np0005531887 python3[4561]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:20 np0005531887 python3[4660]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:20 np0005531887 python3[4731]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793320.2845502-252-224368043178000/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa follow=False checksum=d1aad691a5f7d928d36e451e57eecb0570edc5f2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:21 np0005531887 python3[4854]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:21 np0005531887 python3[4925]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793321.289967-307-63024843809063/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa.pub follow=False checksum=5c64f06d32705901c18adda8251e89259a484c91 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:23 np0005531887 python3[4973]: ansible-ping Invoked with data=pong
Nov 22 01:35:24 np0005531887 python3[4997]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:35:26 np0005531887 python3[5055]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 22 01:35:27 np0005531887 python3[5087]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:28 np0005531887 python3[5111]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:28 np0005531887 python3[5135]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:28 np0005531887 python3[5159]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:29 np0005531887 python3[5183]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:29 np0005531887 python3[5207]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:31 np0005531887 python3[5233]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:31 np0005531887 python3[5311]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:32 np0005531887 python3[5384]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793331.3268504-32-219230732741522/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:33 np0005531887 python3[5432]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531887 python3[5456]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531887 python3[5480]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:33 np0005531887 python3[5504]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531887 python3[5528]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531887 python3[5552]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531887 python3[5576]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:34 np0005531887 python3[5600]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531887 python3[5624]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531887 python3[5648]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531887 python3[5672]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:35 np0005531887 python3[5696]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531887 python3[5720]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531887 python3[5744]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:36 np0005531887 python3[5768]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531887 python3[5792]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531887 python3[5816]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531887 python3[5840]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:37 np0005531887 python3[5864]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531887 python3[5888]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531887 python3[5912]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:38 np0005531887 python3[5936]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531887 python3[5960]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531887 python3[5984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531887 python3[6008]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:39 np0005531887 python3[6032]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:35:42 np0005531887 python3[6058]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 01:35:42 np0005531887 systemd[1]: Starting Time & Date Service...
Nov 22 01:35:42 np0005531887 systemd[1]: Started Time & Date Service.
Nov 22 01:35:42 np0005531887 systemd-timedated[6060]: Changed time zone to 'UTC' (UTC).
Nov 22 01:35:44 np0005531887 python3[6089]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:45 np0005531887 python3[6165]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:45 np0005531887 python3[6236]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.093021-252-80336208889672/source _original_basename=tmp1ix6pp6q follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:46 np0005531887 python3[6336]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:46 np0005531887 python3[6407]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.9513156-303-270054616569933/source _original_basename=tmp8asthvbm follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:47 np0005531887 python3[6509]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:47 np0005531887 python3[6582]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793347.0558276-382-121986211065918/source _original_basename=tmpw5aa07g9 follow=False checksum=df8038d0e1608d98850d3e1f9d175c5362bef36d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:48 np0005531887 python3[6630]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:48 np0005531887 python3[6656]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:48 np0005531887 python3[6736]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:35:49 np0005531887 python3[6809]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793348.6579845-452-272048468448152/source _original_basename=tmpdqrxmp5e follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:35:50 np0005531887 python3[6860]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-2f73-06ac-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:35:50 np0005531887 python3[6888]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-2f73-06ac-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 22 01:35:52 np0005531887 python3[6916]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:36:13 np0005531887 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 01:36:28 np0005531887 python3[6946]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:37:25 np0005531887 systemd[4301]: Starting Mark boot as successful...
Nov 22 01:37:26 np0005531887 systemd[4301]: Finished Mark boot as successful.
Nov 22 01:37:28 np0005531887 systemd-logind[821]: Session 1 logged out. Waiting for processes to exit.
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 22 01:37:33 np0005531887 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 22 01:37:33 np0005531887 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4737] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 01:37:33 np0005531887 systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4874] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4893] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4895] device (eth1): carrier: link connected
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4896] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4900] policy: auto-activating connection 'Wired connection 1' (a54092cf-a033-39de-bd2d-a412d28aa90a)
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4903] device (eth1): Activation: starting connection 'Wired connection 1' (a54092cf-a033-39de-bd2d-a412d28aa90a)
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4903] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4905] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4907] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:37:33 np0005531887 NetworkManager[857]: <info>  [1763793453.4911] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:37:34 np0005531887 systemd-logind[821]: New session 3 of user zuul.
Nov 22 01:37:34 np0005531887 systemd[1]: Started Session 3 of User zuul.
Nov 22 01:37:34 np0005531887 python3[6983]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-e99d-39a2-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:37:41 np0005531887 python3[7065]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:37:42 np0005531887 python3[7138]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793461.1602864-155-209272953712501/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=63c681db47fa4740aef2115040bfdf26ea2af8f8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:37:42 np0005531887 python3[7188]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 01:37:42 np0005531887 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 01:37:42 np0005531887 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 01:37:42 np0005531887 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 01:37:42 np0005531887 systemd[1]: Stopping Network Manager...
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6404] caught SIGTERM, shutting down normally.
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6421] dhcp4 (eth0): canceled DHCP transaction
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6421] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6422] dhcp4 (eth0): state changed no lease
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6425] manager: NetworkManager state is now CONNECTING
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6533] dhcp4 (eth1): canceled DHCP transaction
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.6534] dhcp4 (eth1): state changed no lease
Nov 22 01:37:42 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:37:42 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:37:42 np0005531887 NetworkManager[857]: <info>  [1763793462.7947] exiting (success)
Nov 22 01:37:42 np0005531887 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 01:37:42 np0005531887 systemd[1]: Stopped Network Manager.
Nov 22 01:37:42 np0005531887 systemd[1]: NetworkManager.service: Consumed 1.481s CPU time, 10.0M memory peak.
Nov 22 01:37:42 np0005531887 systemd[1]: Starting Network Manager...
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.8564] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c39ce406-ec93-4c16-a5f8-0230d1610d46)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.8565] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.8613] manager[0x55ba3b9b2070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 01:37:42 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 01:37:42 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9586] hostname: hostname: using hostnamed
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9586] hostname: static hostname changed from (none) to "np0005531887.novalocal"
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9594] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9600] manager[0x55ba3b9b2070]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9601] manager[0x55ba3b9b2070]: rfkill: WWAN hardware radio set enabled
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9645] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9646] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9647] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9647] manager: Networking is enabled by state file
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9650] settings: Loaded settings plugin: keyfile (internal)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9657] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9697] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9715] dhcp: init: Using DHCP client 'internal'
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9720] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9730] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9740] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9753] device (lo): Activation: starting connection 'lo' (b8f7fc42-9e8e-4183-b750-e949ad8a8a15)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9763] device (eth0): carrier: link connected
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9770] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9777] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9778] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9787] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9797] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9806] device (eth1): carrier: link connected
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9813] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9823] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (a54092cf-a033-39de-bd2d-a412d28aa90a) (indicated)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9824] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9835] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9849] device (eth1): Activation: starting connection 'Wired connection 1' (a54092cf-a033-39de-bd2d-a412d28aa90a)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9860] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 01:37:42 np0005531887 systemd[1]: Started Network Manager.
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9867] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9871] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9876] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9881] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9889] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9901] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9907] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9912] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9923] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9927] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9941] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9945] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9972] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9976] dhcp4 (eth0): state changed new lease, address=38.129.56.226
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9985] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 01:37:42 np0005531887 NetworkManager[7205]: <info>  [1763793462.9996] device (lo): Activation: successful, device activated.
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.0013] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 01:37:43 np0005531887 systemd[1]: Starting Network Manager Wait Online...
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2071] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2149] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2154] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2162] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2173] device (eth0): Activation: successful, device activated.
Nov 22 01:37:43 np0005531887 NetworkManager[7205]: <info>  [1763793463.2193] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 01:37:43 np0005531887 python3[7254]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-e99d-39a2-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:37:53 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:38:12 np0005531887 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3306] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 01:38:28 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:38:28 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3654] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3659] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3673] device (eth1): Activation: successful, device activated.
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3681] manager: startup complete
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3685] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <warn>  [1763793508.3698] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3704] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 systemd[1]: Finished Network Manager Wait Online.
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3813] dhcp4 (eth1): canceled DHCP transaction
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3815] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3816] dhcp4 (eth1): state changed no lease
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3833] policy: auto-activating connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c)
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3838] device (eth1): Activation: starting connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c)
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3842] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3847] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3856] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.3866] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.4227] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.4231] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 01:38:28 np0005531887 NetworkManager[7205]: <info>  [1763793508.4237] device (eth1): Activation: successful, device activated.
Nov 22 01:38:38 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:38:43 np0005531887 systemd[1]: session-3.scope: Deactivated successfully.
Nov 22 01:38:43 np0005531887 systemd[1]: session-3.scope: Consumed 1.664s CPU time.
Nov 22 01:38:43 np0005531887 systemd-logind[821]: Session 3 logged out. Waiting for processes to exit.
Nov 22 01:38:43 np0005531887 systemd-logind[821]: Removed session 3.
Nov 22 01:39:17 np0005531887 systemd-logind[821]: New session 4 of user zuul.
Nov 22 01:39:17 np0005531887 systemd[1]: Started Session 4 of User zuul.
Nov 22 01:39:18 np0005531887 python3[7386]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:39:18 np0005531887 python3[7459]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793557.81001-365-181129333387851/source _original_basename=tmpjm4slh7o follow=False checksum=ec5da2e3f9737eb58d2ca927fe651700c5f6760b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:39:21 np0005531887 systemd[1]: session-4.scope: Deactivated successfully.
Nov 22 01:39:21 np0005531887 systemd-logind[821]: Session 4 logged out. Waiting for processes to exit.
Nov 22 01:39:21 np0005531887 systemd-logind[821]: Removed session 4.
Nov 22 01:40:25 np0005531887 systemd[4301]: Created slice User Background Tasks Slice.
Nov 22 01:40:25 np0005531887 systemd[4301]: Starting Cleanup of User's Temporary Files and Directories...
Nov 22 01:40:25 np0005531887 systemd[4301]: Finished Cleanup of User's Temporary Files and Directories.
Nov 22 01:45:32 np0005531887 systemd-logind[821]: New session 5 of user zuul.
Nov 22 01:45:32 np0005531887 systemd[1]: Started Session 5 of User zuul.
Nov 22 01:45:32 np0005531887 python3[7542]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000ca6-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:32 np0005531887 python3[7570]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:32 np0005531887 python3[7597]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531887 python3[7623]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531887 python3[7649]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:33 np0005531887 python3[7675]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:34 np0005531887 python3[7753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:45:34 np0005531887 python3[7826]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793934.1271362-368-209957650405130/source _original_basename=tmpp5a56geq follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:45:35 np0005531887 python3[7876]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 01:45:35 np0005531887 systemd[1]: Reloading.
Nov 22 01:45:35 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:45:37 np0005531887 python3[7933]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 22 01:45:37 np0005531887 python3[7959]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531887 python3[7987]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531887 python3[8015]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:38 np0005531887 python3[8043]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:39 np0005531887 python3[8070]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000cad-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:45:39 np0005531887 python3[8100]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 22 01:45:42 np0005531887 systemd[1]: session-5.scope: Deactivated successfully.
Nov 22 01:45:42 np0005531887 systemd[1]: session-5.scope: Consumed 4.213s CPU time.
Nov 22 01:45:42 np0005531887 systemd-logind[821]: Session 5 logged out. Waiting for processes to exit.
Nov 22 01:45:42 np0005531887 systemd-logind[821]: Removed session 5.
Nov 22 01:45:44 np0005531887 systemd-logind[821]: New session 6 of user zuul.
Nov 22 01:45:44 np0005531887 systemd[1]: Started Session 6 of User zuul.
Nov 22 01:45:44 np0005531887 python3[8135]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 22 01:45:47 np0005531887 irqbalance[816]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 22 01:45:47 np0005531887 irqbalance[816]: IRQ 27 affinity is now unmanaged
Nov 22 01:45:59 np0005531887 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:45:59 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:08 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  Converting 385 SID table entries...
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:17 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:46:18 np0005531887 setsebool[8204]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 22 01:46:18 np0005531887 setsebool[8204]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 22 01:46:30 np0005531887 kernel: SELinux:  Converting 388 SID table entries...
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 01:46:30 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 01:47:06 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 01:47:06 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 01:47:06 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 01:47:06 np0005531887 systemd[1]: Reloading.
Nov 22 01:47:06 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 01:47:06 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 01:47:54 np0005531887 python3[24491]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d633-179b-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 01:47:55 np0005531887 kernel: evm: overlay not supported
Nov 22 01:47:56 np0005531887 systemd[4301]: Starting D-Bus User Message Bus...
Nov 22 01:47:56 np0005531887 dbus-broker-launch[24991]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 22 01:47:56 np0005531887 dbus-broker-launch[24991]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 22 01:47:56 np0005531887 systemd[4301]: Started D-Bus User Message Bus.
Nov 22 01:47:56 np0005531887 dbus-broker-lau[24991]: Ready
Nov 22 01:47:56 np0005531887 systemd[4301]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 01:47:56 np0005531887 systemd[4301]: Created slice Slice /user.
Nov 22 01:47:56 np0005531887 systemd[4301]: podman-24818.scope: unit configures an IP firewall, but not running as root.
Nov 22 01:47:56 np0005531887 systemd[4301]: (This warning is only shown for the first unit using IP firewalling.)
Nov 22 01:47:56 np0005531887 systemd[4301]: Started podman-24818.scope.
Nov 22 01:47:56 np0005531887 systemd[4301]: Started podman-pause-925497ed.scope.
Nov 22 01:47:58 np0005531887 python3[25699]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.155:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.155:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:47:58 np0005531887 python3[25699]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 22 01:47:59 np0005531887 systemd[1]: session-6.scope: Deactivated successfully.
Nov 22 01:47:59 np0005531887 systemd[1]: session-6.scope: Consumed 59.804s CPU time.
Nov 22 01:47:59 np0005531887 systemd-logind[821]: Session 6 logged out. Waiting for processes to exit.
Nov 22 01:47:59 np0005531887 systemd-logind[821]: Removed session 6.
Nov 22 01:48:13 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 01:48:13 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 01:48:13 np0005531887 systemd[1]: man-db-cache-update.service: Consumed 1min 12.298s CPU time.
Nov 22 01:48:13 np0005531887 systemd[1]: run-re937d258419e45f0b9cabab74be9b4e6.service: Deactivated successfully.
Nov 22 01:48:31 np0005531887 systemd-logind[821]: New session 7 of user zuul.
Nov 22 01:48:31 np0005531887 systemd[1]: Started Session 7 of User zuul.
Nov 22 01:48:31 np0005531887 python3[29658]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:32 np0005531887 python3[29684]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:32 np0005531887 python3[29710]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005531887.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 22 01:48:34 np0005531887 python3[29744]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 01:48:34 np0005531887 python3[29824]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:48:35 np0005531887 python3[29897]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763794114.2024999-168-90521986827376/source _original_basename=tmp5f_91rf4 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:48:36 np0005531887 python3[29947]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 22 01:48:36 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 01:48:36 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 01:48:36 np0005531887 systemd-hostnamed[29951]: Changed pretty hostname to 'compute-1'
Nov 22 01:48:36 np0005531887 systemd-hostnamed[29951]: Hostname set to <compute-1> (static)
Nov 22 01:48:36 np0005531887 NetworkManager[7205]: <info>  [1763794116.1770] hostname: static hostname changed from "np0005531887.novalocal" to "compute-1"
Nov 22 01:48:36 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 01:48:36 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 01:48:36 np0005531887 systemd[1]: session-7.scope: Deactivated successfully.
Nov 22 01:48:36 np0005531887 systemd[1]: session-7.scope: Consumed 2.617s CPU time.
Nov 22 01:48:36 np0005531887 systemd-logind[821]: Session 7 logged out. Waiting for processes to exit.
Nov 22 01:48:36 np0005531887 systemd-logind[821]: Removed session 7.
Nov 22 01:48:46 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 01:49:06 np0005531887 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 01:49:25 np0005531887 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 22 01:49:25 np0005531887 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 22 01:49:25 np0005531887 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 22 01:49:25 np0005531887 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 22 01:55:33 np0005531887 systemd-logind[821]: New session 8 of user zuul.
Nov 22 01:55:33 np0005531887 systemd[1]: Started Session 8 of User zuul.
Nov 22 01:55:34 np0005531887 python3[30072]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 01:55:36 np0005531887 python3[30188]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:36 np0005531887 python3[30261]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:36 np0005531887 python3[30287]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:37 np0005531887 python3[30360]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:37 np0005531887 python3[30386]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:37 np0005531887 python3[30459]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:38 np0005531887 python3[30485]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:38 np0005531887 python3[30558]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:38 np0005531887 python3[30584]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:38 np0005531887 python3[30657]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:39 np0005531887 python3[30683]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:39 np0005531887 python3[30756]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:39 np0005531887 python3[30782]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 01:55:40 np0005531887 python3[30855]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.961896-33953-107578261397483/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 01:55:49 np0005531887 python3[30905]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:00:49 np0005531887 systemd[1]: session-8.scope: Deactivated successfully.
Nov 22 02:00:49 np0005531887 systemd[1]: session-8.scope: Consumed 4.804s CPU time.
Nov 22 02:00:49 np0005531887 systemd-logind[821]: Session 8 logged out. Waiting for processes to exit.
Nov 22 02:00:49 np0005531887 systemd-logind[821]: Removed session 8.
Nov 22 02:11:49 np0005531887 systemd-logind[821]: New session 9 of user zuul.
Nov 22 02:11:49 np0005531887 systemd[1]: Started Session 9 of User zuul.
Nov 22 02:11:51 np0005531887 python3.9[31132]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:11:52 np0005531887 python3.9[31315]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:12:00 np0005531887 systemd[1]: session-9.scope: Deactivated successfully.
Nov 22 02:12:00 np0005531887 systemd[1]: session-9.scope: Consumed 8.762s CPU time.
Nov 22 02:12:00 np0005531887 systemd-logind[821]: Session 9 logged out. Waiting for processes to exit.
Nov 22 02:12:00 np0005531887 systemd-logind[821]: Removed session 9.
Nov 22 02:12:16 np0005531887 systemd-logind[821]: New session 10 of user zuul.
Nov 22 02:12:16 np0005531887 systemd[1]: Started Session 10 of User zuul.
Nov 22 02:12:16 np0005531887 python3.9[31527]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 22 02:12:18 np0005531887 python3.9[31701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:19 np0005531887 python3.9[31853]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:12:20 np0005531887 python3.9[32006]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:12:21 np0005531887 python3.9[32158]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:21 np0005531887 python3.9[32310]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:12:22 np0005531887 python3.9[32433]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795541.276762-183-176357936458162/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:23 np0005531887 python3.9[32585]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:24 np0005531887 python3.9[32741]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:12:24 np0005531887 python3.9[32893]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:12:26 np0005531887 python3.9[33043]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:12:30 np0005531887 python3.9[33296]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:12:31 np0005531887 python3.9[33446]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:32 np0005531887 python3.9[33602]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:12:33 np0005531887 python3.9[33760]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:12:34 np0005531887 python3.9[33844]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:13:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:13:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:20 np0005531887 systemd[1]: Starting dnf makecache...
Nov 22 02:13:20 np0005531887 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 22 02:13:20 np0005531887 dnf[34055]: Failed determining last makecache time.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-barbican-42b4c41831408a8e323 117 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 144 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-cinder-1c00d6490d88e436f26ef 115 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-stevedore-c4acc5639fd2329372142 155 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-observabilityclient-2f31846d73c 153 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-os-net-config-bbae2ed8a159b0435a473f38 190 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 187 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-designate-tests-tempest-347fdbc 135 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-glance-1fd12c29b339f30fe823e 158 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 160 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-manila-3c01b7181572c95dac462 179 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-whitebox-neutron-tests-tempest- 151 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-octavia-ba397f07a7331190208c 159 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-watcher-c014f81a8647287f6dcc 165 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-tcib-1124124ec06aadbac34f0d340b 167 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 151 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-swift-dc98a8463506ac520c469a 147 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-python-tempestconf-8515371b7cceebd4282 133 kB/s | 3.0 kB     00:00
Nov 22 02:13:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:13:20 np0005531887 dnf[34055]: delorean-openstack-heat-ui-013accbfd179753bc3f0 170 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 22 02:13:21 np0005531887 dnf[34055]: CentOS Stream 9 - BaseOS                         59 kB/s | 7.3 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: CentOS Stream 9 - AppStream                      71 kB/s | 7.4 kB     00:00
Nov 22 02:13:21 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:13:21 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:13:21 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:13:21 np0005531887 dnf[34055]: CentOS Stream 9 - CRB                            73 kB/s | 7.2 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: CentOS Stream 9 - Extras packages                76 kB/s | 8.3 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: dlrn-antelope-testing                           101 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: dlrn-antelope-build-deps                        140 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: centos9-rabbitmq                                 93 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: centos9-storage                                 122 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: centos9-opstools                                116 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: NFV SIG OpenvSwitch                             128 kB/s | 3.0 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: repo-setup-centos-appstream                     213 kB/s | 4.4 kB     00:00
Nov 22 02:13:21 np0005531887 dnf[34055]: repo-setup-centos-baseos                        165 kB/s | 3.9 kB     00:00
Nov 22 02:13:22 np0005531887 dnf[34055]: repo-setup-centos-highavailability              156 kB/s | 3.9 kB     00:00
Nov 22 02:13:22 np0005531887 dnf[34055]: repo-setup-centos-powertools                    189 kB/s | 4.3 kB     00:00
Nov 22 02:13:22 np0005531887 dnf[34055]: Extra Packages for Enterprise Linux 9 - x86_64  256 kB/s |  33 kB     00:00
Nov 22 02:13:25 np0005531887 dnf[34055]: Extra Packages for Enterprise Linux 9 - x86_64  7.4 MB/s |  20 MB     00:02
Nov 22 02:13:35 np0005531887 dnf[34055]: Metadata cache created.
Nov 22 02:13:35 np0005531887 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 02:13:35 np0005531887 systemd[1]: Finished dnf makecache.
Nov 22 02:13:35 np0005531887 systemd[1]: dnf-makecache.service: Consumed 11.071s CPU time.
Nov 22 02:14:34 np0005531887 kernel: SELinux:  Converting 2718 SID table entries...
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:14:34 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:14:34 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 22 02:14:34 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:14:34 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:14:34 np0005531887 systemd[1]: Reloading.
Nov 22 02:14:34 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:14:34 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:14:35 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:14:35 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:14:35 np0005531887 systemd[1]: man-db-cache-update.service: Consumed 1.227s CPU time.
Nov 22 02:14:35 np0005531887 systemd[1]: run-r0d1706df3560401bb2cc6459aa4dc0d7.service: Deactivated successfully.
Nov 22 02:14:42 np0005531887 python3.9[35399]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:14:44 np0005531887 python3.9[35680]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 22 02:14:46 np0005531887 python3.9[35832]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 22 02:14:53 np0005531887 python3.9[35986]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:07 np0005531887 python3.9[36138]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 22 02:15:11 np0005531887 python3.9[36292]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:12 np0005531887 python3.9[36444]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:12 np0005531887 python3.9[36567]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795711.6267536-672-184587979472674/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:14 np0005531887 python3.9[36719]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:15 np0005531887 python3.9[36871]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:15 np0005531887 python3.9[37024]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:15:17 np0005531887 python3.9[37176]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 22 02:15:17 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:15:17 np0005531887 irqbalance[816]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 22 02:15:17 np0005531887 irqbalance[816]: IRQ 26 affinity is now unmanaged
Nov 22 02:15:18 np0005531887 python3.9[37330]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:15:19 np0005531887 python3.9[37488]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:15:20 np0005531887 python3.9[37648]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 22 02:15:20 np0005531887 python3.9[37801]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:15:22 np0005531887 python3.9[37959]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 22 02:15:23 np0005531887 python3.9[38111]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:15:26 np0005531887 python3.9[38264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:27 np0005531887 python3.9[38416]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:27 np0005531887 python3.9[38539]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795726.5861306-1029-57972090717533/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:28 np0005531887 python3.9[38691]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:15:28 np0005531887 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:15:28 np0005531887 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 22 02:15:28 np0005531887 kernel: Bridge firewalling registered
Nov 22 02:15:28 np0005531887 systemd-modules-load[38695]: Inserted module 'br_netfilter'
Nov 22 02:15:28 np0005531887 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:15:29 np0005531887 python3.9[38850]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:15:30 np0005531887 python3.9[38973]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795729.2009792-1098-229863080422944/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:15:31 np0005531887 python3.9[39125]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:15:35 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:15:35 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:15:36 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:15:36 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:15:36 np0005531887 systemd[1]: Reloading.
Nov 22 02:15:36 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:36 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:15:38 np0005531887 python3.9[40417]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:39 np0005531887 python3.9[41463]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 22 02:15:39 np0005531887 python3.9[42370]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:15:40 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:15:40 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:15:40 np0005531887 systemd[1]: man-db-cache-update.service: Consumed 5.016s CPU time.
Nov 22 02:15:40 np0005531887 systemd[1]: run-r6c890e4b1ba14fd3b2a32b8705cf56f9.service: Deactivated successfully.
Nov 22 02:15:40 np0005531887 python3.9[43326]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:41 np0005531887 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 02:15:41 np0005531887 systemd[1]: Starting Authorization Manager...
Nov 22 02:15:41 np0005531887 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 02:15:41 np0005531887 polkitd[43544]: Started polkitd version 0.117
Nov 22 02:15:41 np0005531887 systemd[1]: Started Authorization Manager.
Nov 22 02:15:42 np0005531887 python3.9[43714]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:42 np0005531887 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 22 02:15:42 np0005531887 systemd[1]: tuned.service: Deactivated successfully.
Nov 22 02:15:42 np0005531887 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 22 02:15:42 np0005531887 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 02:15:42 np0005531887 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 02:15:43 np0005531887 python3.9[43876]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 22 02:15:48 np0005531887 python3.9[44030]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:48 np0005531887 systemd[1]: Reloading.
Nov 22 02:15:48 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:49 np0005531887 python3.9[44220]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:15:49 np0005531887 systemd[1]: Reloading.
Nov 22 02:15:49 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:15:50 np0005531887 python3.9[44409]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:51 np0005531887 python3.9[44562]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:51 np0005531887 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 22 02:15:51 np0005531887 python3.9[44715]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:54 np0005531887 python3.9[44878]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:15:55 np0005531887 python3.9[45031]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:15:55 np0005531887 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 02:15:55 np0005531887 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 02:15:55 np0005531887 systemd[1]: Stopping Apply Kernel Variables...
Nov 22 02:15:55 np0005531887 systemd[1]: Starting Apply Kernel Variables...
Nov 22 02:15:55 np0005531887 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 02:15:55 np0005531887 systemd[1]: Finished Apply Kernel Variables.
Nov 22 02:15:55 np0005531887 systemd-logind[821]: Session 10 logged out. Waiting for processes to exit.
Nov 22 02:15:55 np0005531887 systemd[1]: session-10.scope: Deactivated successfully.
Nov 22 02:15:55 np0005531887 systemd[1]: session-10.scope: Consumed 2min 23.837s CPU time.
Nov 22 02:15:55 np0005531887 systemd-logind[821]: Removed session 10.
Nov 22 02:16:01 np0005531887 systemd-logind[821]: New session 11 of user zuul.
Nov 22 02:16:01 np0005531887 systemd[1]: Started Session 11 of User zuul.
Nov 22 02:16:02 np0005531887 python3.9[45214]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:03 np0005531887 python3.9[45368]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:05 np0005531887 python3.9[45524]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:16:06 np0005531887 python3.9[45675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:07 np0005531887 python3.9[45831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:16:08 np0005531887 python3.9[45915]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:16:10 np0005531887 python3.9[46068]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:16:11 np0005531887 python3.9[46239]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:16:12 np0005531887 python3.9[46391]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:16:12 np0005531887 podman[46392]: 2025-11-22 07:16:12.703084746 +0000 UTC m=+0.181361219 system refresh
Nov 22 02:16:13 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:16:13 np0005531887 python3.9[46553]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:16:14 np0005531887 python3.9[46676]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795772.9098291-293-21657589721559/.source.json follow=False _original_basename=podman_network_config.j2 checksum=44500ac0de5e108f0c792e1b0f644391757aafef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:16:15 np0005531887 python3.9[46828]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:16:15 np0005531887 python3.9[46951]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795774.7612312-338-31590812087066/.source.conf follow=False _original_basename=registries.conf.j2 checksum=193e1b13ee9dd51d1fc7c456c46399ca66d3b9c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:16 np0005531887 python3.9[47103]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:17 np0005531887 python3.9[47255]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:18 np0005531887 python3.9[47407]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:18 np0005531887 python3.9[47559]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:16:20 np0005531887 python3.9[47709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:16:20 np0005531887 python3.9[47863]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:23 np0005531887 python3.9[48018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:26 np0005531887 python3.9[48178]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:29 np0005531887 python3.9[48331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:32 np0005531887 python3.9[48484]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:37 np0005531887 python3.9[48640]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:44 np0005531887 python3.9[48810]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:16:47 np0005531887 python3.9[48963]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:17:10 np0005531887 python3.9[49302]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:17:12 np0005531887 python3.9[49458]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:17:13 np0005531887 python3.9[49633]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:17:13 np0005531887 python3.9[49756]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763795832.8926685-782-27634816554170/.source.json _original_basename=.9y7yggep follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:17:15 np0005531887 python3.9[49908]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:15 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay-compat3656496476-lower\x2dmapped.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531887 podman[49919]: 2025-11-22 07:17:23.470024253 +0000 UTC m=+8.379015665 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:17:23 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:23 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:24 np0005531887 python3.9[50217]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:24 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531887 podman[50229]: 2025-11-22 07:17:28.123408884 +0000 UTC m=+3.324731416 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:17:28 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:28 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:29 np0005531887 python3.9[50467]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:29 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:54 np0005531887 podman[50479]: 2025-11-22 07:17:54.370714464 +0000 UTC m=+24.847095126 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:17:54 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:54 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:54 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:17:58 np0005531887 python3.9[50761]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:17:58 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531887 podman[50772]: 2025-11-22 07:18:07.004673163 +0000 UTC m=+7.989333799 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 02:18:07 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:07 np0005531887 python3.9[51031]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 02:18:18 np0005531887 podman[51043]: 2025-11-22 07:18:18.231300076 +0000 UTC m=+10.321563554 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 02:18:18 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:18 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:18 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:18:18 np0005531887 systemd-logind[821]: Session 11 logged out. Waiting for processes to exit.
Nov 22 02:18:18 np0005531887 systemd[1]: session-11.scope: Deactivated successfully.
Nov 22 02:18:18 np0005531887 systemd[1]: session-11.scope: Consumed 1min 41.387s CPU time.
Nov 22 02:18:18 np0005531887 systemd-logind[821]: Removed session 11.
Nov 22 02:18:24 np0005531887 systemd-logind[821]: New session 12 of user zuul.
Nov 22 02:18:24 np0005531887 systemd[1]: Started Session 12 of User zuul.
Nov 22 02:18:25 np0005531887 python3.9[51354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:18:30 np0005531887 python3.9[51510]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 22 02:18:31 np0005531887 python3.9[51663]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:18:32 np0005531887 python3.9[51821]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:18:33 np0005531887 python3.9[51981]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:18:34 np0005531887 python3.9[52065]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:18:38 np0005531887 python3.9[52227]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:18:52 np0005531887 kernel: SELinux:  Converting 2731 SID table entries...
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:18:52 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:18:53 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 22 02:18:53 np0005531887 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 22 02:18:55 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:18:55 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:18:55 np0005531887 systemd[1]: Reloading.
Nov 22 02:18:55 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:18:55 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:18:55 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:18:56 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:18:56 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:18:56 np0005531887 systemd[1]: run-r7cb3c6fc7e90417c9ce46e29cc620966.service: Deactivated successfully.
Nov 22 02:18:58 np0005531887 python3.9[53325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:18:58 np0005531887 systemd[1]: Reloading.
Nov 22 02:18:58 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:18:58 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:18:58 np0005531887 systemd[1]: Starting Open vSwitch Database Unit...
Nov 22 02:18:58 np0005531887 chown[53368]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 22 02:18:58 np0005531887 ovs-ctl[53373]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 22 02:18:58 np0005531887 ovs-ctl[53373]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 22 02:18:58 np0005531887 ovs-ctl[53373]: Starting ovsdb-server [  OK  ]
Nov 22 02:18:58 np0005531887 ovs-vsctl[53422]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 22 02:18:58 np0005531887 ovs-vsctl[53442]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"73ab1342-b2af-4236-8199-7d435ebce194\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 22 02:18:58 np0005531887 ovs-ctl[53373]: Configuring Open vSwitch system IDs [  OK  ]
Nov 22 02:18:58 np0005531887 ovs-ctl[53373]: Enabling remote OVSDB managers [  OK  ]
Nov 22 02:18:58 np0005531887 ovs-vsctl[53448]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 22 02:18:58 np0005531887 systemd[1]: Started Open vSwitch Database Unit.
Nov 22 02:18:58 np0005531887 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 22 02:18:58 np0005531887 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 22 02:18:59 np0005531887 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 22 02:18:59 np0005531887 kernel: openvswitch: Open vSwitch switching datapath
Nov 22 02:18:59 np0005531887 ovs-ctl[53493]: Inserting openvswitch module [  OK  ]
Nov 22 02:18:59 np0005531887 ovs-ctl[53462]: Starting ovs-vswitchd [  OK  ]
Nov 22 02:18:59 np0005531887 ovs-vsctl[53510]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 22 02:18:59 np0005531887 ovs-ctl[53462]: Enabling remote OVSDB managers [  OK  ]
Nov 22 02:18:59 np0005531887 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 22 02:18:59 np0005531887 systemd[1]: Starting Open vSwitch...
Nov 22 02:18:59 np0005531887 systemd[1]: Finished Open vSwitch.
Nov 22 02:19:00 np0005531887 python3.9[53662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:19:01 np0005531887 python3.9[53814]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 22 02:19:04 np0005531887 kernel: SELinux:  Converting 2745 SID table entries...
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:19:04 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:19:05 np0005531887 python3.9[53970]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:19:06 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 22 02:19:07 np0005531887 python3.9[54128]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:09 np0005531887 python3.9[54281]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:19:11 np0005531887 python3.9[54568]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:19:11 np0005531887 python3.9[54718]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:19:12 np0005531887 python3.9[54872]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:14 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:19:14 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:19:14 np0005531887 systemd[1]: Reloading.
Nov 22 02:19:14 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:19:14 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:19:15 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:19:15 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:19:15 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:19:15 np0005531887 systemd[1]: run-rb345fc832c204d19a2303f187e3ae4ad.service: Deactivated successfully.
Nov 22 02:19:16 np0005531887 python3.9[55192]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:19:16 np0005531887 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 02:19:16 np0005531887 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 02:19:16 np0005531887 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 02:19:16 np0005531887 systemd[1]: Stopping Network Manager...
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7422] caught SIGTERM, shutting down normally.
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7441] dhcp4 (eth0): canceled DHCP transaction
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7442] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7442] dhcp4 (eth0): state changed no lease
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7445] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 02:19:16 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 02:19:16 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 02:19:16 np0005531887 NetworkManager[7205]: <info>  [1763795956.7847] exiting (success)
Nov 22 02:19:16 np0005531887 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 02:19:16 np0005531887 systemd[1]: Stopped Network Manager.
Nov 22 02:19:16 np0005531887 systemd[1]: NetworkManager.service: Consumed 21.585s CPU time, 4.1M memory peak, read 0B from disk, written 33.0K to disk.
Nov 22 02:19:16 np0005531887 systemd[1]: Starting Network Manager...
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.8672] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c39ce406-ec93-4c16-a5f8-0230d1610d46)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.8675] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.8747] manager[0x55c7ff82b090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 02:19:16 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 02:19:16 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9577] hostname: hostname: using hostnamed
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9578] hostname: static hostname changed from (none) to "compute-1"
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9583] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9589] manager[0x55c7ff82b090]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9589] manager[0x55c7ff82b090]: rfkill: WWAN hardware radio set enabled
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9613] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9624] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9625] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9626] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9626] manager: Networking is enabled by state file
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9628] settings: Loaded settings plugin: keyfile (internal)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9632] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9656] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9666] dhcp: init: Using DHCP client 'internal'
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9668] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9673] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9679] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9686] device (lo): Activation: starting connection 'lo' (b8f7fc42-9e8e-4183-b750-e949ad8a8a15)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9694] device (eth0): carrier: link connected
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9698] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9701] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9702] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9709] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9715] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9722] device (eth1): carrier: link connected
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9726] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9731] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c) (indicated)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9732] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9738] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9746] device (eth1): Activation: starting connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c)
Nov 22 02:19:16 np0005531887 systemd[1]: Started Network Manager.
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9757] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9768] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9780] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9783] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9786] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9789] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9793] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9796] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9799] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9805] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9808] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9838] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9856] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9870] dhcp4 (eth0): state changed new lease, address=38.129.56.226
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9879] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 02:19:16 np0005531887 systemd[1]: Starting Network Manager Wait Online...
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9980] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9988] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 02:19:16 np0005531887 NetworkManager[55210]: <info>  [1763795956.9995] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0001] device (lo): Activation: successful, device activated.
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0009] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0014] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0018] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0022] device (eth1): Activation: successful, device activated.
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0030] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0033] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0039] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0043] device (eth0): Activation: successful, device activated.
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0048] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 02:19:17 np0005531887 NetworkManager[55210]: <info>  [1763795957.0124] manager: startup complete
Nov 22 02:19:17 np0005531887 systemd[1]: Finished Network Manager Wait Online.
Nov 22 02:19:17 np0005531887 python3.9[55418]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:19:25 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:19:25 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:19:25 np0005531887 systemd[1]: Reloading.
Nov 22 02:19:25 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:19:25 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:19:25 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:19:27 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 02:19:27 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:19:27 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:19:27 np0005531887 systemd[1]: run-rde16a7f205774d008a6ee206afedd10b.service: Deactivated successfully.
Nov 22 02:19:28 np0005531887 python3.9[55879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:19:29 np0005531887 python3.9[56031]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:30 np0005531887 python3.9[56185]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:31 np0005531887 python3.9[56337]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:32 np0005531887 python3.9[56489]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:32 np0005531887 python3.9[56641]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:33 np0005531887 python3.9[56793]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:34 np0005531887 python3.9[56916]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795973.0995095-653-102950933293496/.source _original_basename=.a56letw8 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:35 np0005531887 python3.9[57068]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:36 np0005531887 python3.9[57220]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 22 02:19:37 np0005531887 python3.9[57372]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:40 np0005531887 python3.9[57801]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 22 02:19:41 np0005531887 ansible-async_wrapper.py[57976]: Invoked with j942434221639 300 /home/zuul/.ansible/tmp/ansible-tmp-1763795980.4381638-851-66704363925720/AnsiballZ_edpm_os_net_config.py _
Nov 22 02:19:41 np0005531887 ansible-async_wrapper.py[57979]: Starting module and watcher
Nov 22 02:19:41 np0005531887 ansible-async_wrapper.py[57979]: Start watching 57980 (300)
Nov 22 02:19:41 np0005531887 ansible-async_wrapper.py[57980]: Start module (57980)
Nov 22 02:19:41 np0005531887 ansible-async_wrapper.py[57976]: Return async_wrapper task started.
Nov 22 02:19:41 np0005531887 python3.9[57981]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 22 02:19:42 np0005531887 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 22 02:19:42 np0005531887 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 22 02:19:42 np0005531887 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 22 02:19:42 np0005531887 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 22 02:19:42 np0005531887 kernel: cfg80211: failed to load regulatory.db
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.5660] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.5681] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6360] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6361] audit: op="connection-add" uuid="21f7c3b9-f90c-47a7-8754-8d0d9e4fa79b" name="br-ex-br" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6377] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6379] audit: op="connection-add" uuid="ec4ae993-a327-44f8-bfd7-7dd73d252f1b" name="br-ex-port" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6389] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6390] audit: op="connection-add" uuid="047612f1-5e9c-4c53-89f1-63a616878f80" name="eth1-port" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6402] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6404] audit: op="connection-add" uuid="868782ab-664d-4629-b2af-6e4519fd3a7b" name="vlan20-port" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6417] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6418] audit: op="connection-add" uuid="a7c4dfe3-a7be-436a-9c63-b7daba5fcef4" name="vlan21-port" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6429] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6430] audit: op="connection-add" uuid="ad9ad911-f770-45cc-9891-1d2c1602ce80" name="vlan22-port" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6450] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6466] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.6467] audit: op="connection-add" uuid="6c67c6ea-6d66-4247-adb1-5ad81041e45c" name="br-ex-if" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8203] audit: op="connection-update" uuid="1cca51b0-a68b-5692-83d5-ea4fdaca949c" name="ci-private-network" args="ovs-external-ids.data,connection.slave-type,connection.master,connection.controller,connection.timestamp,connection.port-type,ipv6.method,ipv6.dns,ipv6.routes,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv4.method,ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.never-default,ipv4.routing-rules,ovs-interface.type" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8224] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8226] audit: op="connection-add" uuid="55e5a2a4-f530-45e7-b8f1-57c9d47a415b" name="vlan20-if" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8239] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8241] audit: op="connection-add" uuid="f8b4d368-445b-444e-8d6e-cf11d8f6fb0b" name="vlan21-if" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8255] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8256] audit: op="connection-add" uuid="395f82b7-8712-4b88-a7ae-8168f70e866c" name="vlan22-if" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8270] audit: op="connection-delete" uuid="a54092cf-a033-39de-bd2d-a412d28aa90a" name="Wired connection 1" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8281] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8293] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8297] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (21f7c3b9-f90c-47a7-8754-8d0d9e4fa79b)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8297] audit: op="connection-activate" uuid="21f7c3b9-f90c-47a7-8754-8d0d9e4fa79b" name="br-ex-br" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8299] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8306] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8310] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (ec4ae993-a327-44f8-bfd7-7dd73d252f1b)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8312] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8319] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8324] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (047612f1-5e9c-4c53-89f1-63a616878f80)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8326] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8333] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8337] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (868782ab-664d-4629-b2af-6e4519fd3a7b)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8340] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8346] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8353] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (a7c4dfe3-a7be-436a-9c63-b7daba5fcef4)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8356] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8365] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8368] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ad9ad911-f770-45cc-9891-1d2c1602ce80)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8369] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8371] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8373] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8381] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8386] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8390] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (6c67c6ea-6d66-4247-adb1-5ad81041e45c)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8391] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8394] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8396] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8398] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8400] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8414] device (eth1): disconnecting for new activation request.
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8414] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8417] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8419] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8420] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8423] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8427] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8432] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (55e5a2a4-f530-45e7-b8f1-57c9d47a415b)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8433] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8436] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8438] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8439] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8442] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8447] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8450] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f8b4d368-445b-444e-8d6e-cf11d8f6fb0b)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8451] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8453] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8455] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8457] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8460] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8465] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8470] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (395f82b7-8712-4b88-a7ae-8168f70e866c)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8471] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8474] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8476] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8477] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8478] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8494] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=57982 uid=0 result="success"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8498] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8501] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8503] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8509] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8513] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8517] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8520] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8522] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8527] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 kernel: ovs-system: entered promiscuous mode
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8542] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8546] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8548] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8553] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 kernel: Timeout policy base is empty
Nov 22 02:19:43 np0005531887 systemd-udevd[57986]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8565] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8569] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8571] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8575] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8579] dhcp4 (eth0): canceled DHCP transaction
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8579] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8579] dhcp4 (eth0): state changed no lease
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8581] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8593] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8596] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57982 uid=0 result="fail" reason="Device is not activated"
Nov 22 02:19:43 np0005531887 NetworkManager[55210]: <info>  [1763795983.8599] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 22 02:19:43 np0005531887 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 02:19:43 np0005531887 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 02:19:43 np0005531887 kernel: br-ex: entered promiscuous mode
Nov 22 02:19:43 np0005531887 kernel: vlan21: entered promiscuous mode
Nov 22 02:19:43 np0005531887 systemd-udevd[57988]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:19:43 np0005531887 kernel: vlan20: entered promiscuous mode
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0420] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0426] dhcp4 (eth0): state changed new lease, address=38.129.56.226
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0436] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0450] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0457] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.0462] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:44 np0005531887 kernel: vlan22: entered promiscuous mode
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1736] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1830] device (eth1): Activation: starting connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c)
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1834] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1836] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1837] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1838] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1840] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1841] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1844] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1853] device (eth1): disconnecting for new activation request.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1854] audit: op="connection-activate" uuid="1cca51b0-a68b-5692-83d5-ea4fdaca949c" name="ci-private-network" pid=57982 uid=0 result="success"
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1870] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1874] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1879] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1883] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1888] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1893] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1900] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1906] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1909] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1912] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1916] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1919] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1923] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1948] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1957] device (eth1): Activation: starting connection 'ci-private-network' (1cca51b0-a68b-5692-83d5-ea4fdaca949c)
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1960] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57982 uid=0 result="success"
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1963] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1986] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.1989] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2001] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2007] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2017] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2021] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2026] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2031] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2035] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2039] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2041] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 22 02:19:44 np0005531887 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2061] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2069] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2076] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2078] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2081] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2087] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2094] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2100] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.2106] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.3663] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.3673] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 02:19:44 np0005531887 NetworkManager[55210]: <info>  [1763795984.3681] device (eth1): Activation: successful, device activated.
Nov 22 02:19:45 np0005531887 python3.9[58319]: ansible-ansible.legacy.async_status Invoked with jid=j942434221639.57976 mode=status _async_dir=/root/.ansible_async
Nov 22 02:19:45 np0005531887 NetworkManager[55210]: <info>  [1763795985.6451] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57982 uid=0 result="success"
Nov 22 02:19:45 np0005531887 NetworkManager[55210]: <info>  [1763795985.8167] checkpoint[0x55c7ff801950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 22 02:19:45 np0005531887 NetworkManager[55210]: <info>  [1763795985.8170] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.1246] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.1265] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.3400] audit: op="networking-control" arg="global-dns-configuration" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.3477] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.3708] audit: op="networking-control" arg="global-dns-configuration" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.3739] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 ansible-async_wrapper.py[57979]: 57980 still running (300)
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.5329] checkpoint[0x55c7ff801a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 22 02:19:46 np0005531887 NetworkManager[55210]: <info>  [1763795986.5334] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57982 uid=0 result="success"
Nov 22 02:19:46 np0005531887 ansible-async_wrapper.py[57980]: Module complete (57980)
Nov 22 02:19:46 np0005531887 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 02:19:48 np0005531887 python3.9[58427]: ansible-ansible.legacy.async_status Invoked with jid=j942434221639.57976 mode=status _async_dir=/root/.ansible_async
Nov 22 02:19:49 np0005531887 python3.9[58527]: ansible-ansible.legacy.async_status Invoked with jid=j942434221639.57976 mode=cleanup _async_dir=/root/.ansible_async
Nov 22 02:19:50 np0005531887 python3.9[58679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:50 np0005531887 python3.9[58802]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795989.7726269-927-103315454834564/.source.returncode _original_basename=.l2blr70o follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:51 np0005531887 ansible-async_wrapper.py[57979]: Done in kid B.
Nov 22 02:19:51 np0005531887 python3.9[58954]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:19:52 np0005531887 python3.9[59078]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795991.2124536-975-120718488460204/.source.cfg _original_basename=.4lvs8m85 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:19:53 np0005531887 python3.9[59230]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:19:53 np0005531887 systemd[1]: Reloading Network Manager...
Nov 22 02:19:53 np0005531887 NetworkManager[55210]: <info>  [1763795993.3874] audit: op="reload" arg="0" pid=59234 uid=0 result="success"
Nov 22 02:19:53 np0005531887 NetworkManager[55210]: <info>  [1763795993.3882] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 22 02:19:53 np0005531887 systemd[1]: Reloaded Network Manager.
Nov 22 02:19:53 np0005531887 systemd[1]: session-12.scope: Deactivated successfully.
Nov 22 02:19:53 np0005531887 systemd[1]: session-12.scope: Consumed 55.293s CPU time.
Nov 22 02:19:53 np0005531887 systemd-logind[821]: Session 12 logged out. Waiting for processes to exit.
Nov 22 02:19:53 np0005531887 systemd-logind[821]: Removed session 12.
Nov 22 02:20:00 np0005531887 systemd-logind[821]: New session 13 of user zuul.
Nov 22 02:20:00 np0005531887 systemd[1]: Started Session 13 of User zuul.
Nov 22 02:20:01 np0005531887 python3.9[59418]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:02 np0005531887 python3.9[59573]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:03 np0005531887 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 02:20:03 np0005531887 python3.9[59763]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:04 np0005531887 systemd[1]: session-13.scope: Deactivated successfully.
Nov 22 02:20:04 np0005531887 systemd[1]: session-13.scope: Consumed 2.550s CPU time.
Nov 22 02:20:04 np0005531887 systemd-logind[821]: Session 13 logged out. Waiting for processes to exit.
Nov 22 02:20:04 np0005531887 systemd-logind[821]: Removed session 13.
Nov 22 02:20:10 np0005531887 systemd-logind[821]: New session 14 of user zuul.
Nov 22 02:20:10 np0005531887 systemd[1]: Started Session 14 of User zuul.
Nov 22 02:20:11 np0005531887 python3.9[59945]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:12 np0005531887 python3.9[60099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:14 np0005531887 python3.9[60255]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:15 np0005531887 python3.9[60340]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:17 np0005531887 python3.9[60493]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:18 np0005531887 python3.9[60684]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:19 np0005531887 python3.9[60836]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:20:20 np0005531887 python3.9[61000]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:20 np0005531887 python3.9[61078]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:21 np0005531887 python3.9[61230]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:22 np0005531887 python3.9[61308]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:22 np0005531887 python3.9[61460]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:23 np0005531887 python3.9[61612]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:24 np0005531887 python3.9[61764]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:24 np0005531887 python3.9[61916]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:20:25 np0005531887 python3.9[62068]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:28 np0005531887 python3.9[62221]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:29 np0005531887 python3.9[62375]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:20:30 np0005531887 python3.9[62527]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:20:30 np0005531887 python3.9[62679]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:20:31 np0005531887 python3.9[62832]: ansible-service_facts Invoked
Nov 22 02:20:31 np0005531887 network[62849]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:20:31 np0005531887 network[62850]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:20:31 np0005531887 network[62851]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:20:37 np0005531887 python3.9[63303]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:20:39 np0005531887 python3.9[63456]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 22 02:20:41 np0005531887 python3.9[63608]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:41 np0005531887 python3.9[63733]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796040.7857668-663-226532108969361/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:42 np0005531887 python3.9[63887]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:20:43 np0005531887 python3.9[64012]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796042.2228856-708-175524386453612/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:45 np0005531887 python3.9[64166]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:20:47 np0005531887 python3.9[64320]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:48 np0005531887 python3.9[64404]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:20:49 np0005531887 python3.9[64558]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:20:50 np0005531887 python3.9[64642]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:20:50 np0005531887 systemd[1]: Stopping NTP client/server...
Nov 22 02:20:50 np0005531887 chronyd[829]: chronyd exiting
Nov 22 02:20:50 np0005531887 systemd[1]: chronyd.service: Deactivated successfully.
Nov 22 02:20:50 np0005531887 systemd[1]: Stopped NTP client/server.
Nov 22 02:20:50 np0005531887 systemd[1]: Starting NTP client/server...
Nov 22 02:20:50 np0005531887 chronyd[64650]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 02:20:50 np0005531887 chronyd[64650]: Frequency -23.463 +/- 0.208 ppm read from /var/lib/chrony/drift
Nov 22 02:20:50 np0005531887 chronyd[64650]: Loaded seccomp filter (level 2)
Nov 22 02:20:50 np0005531887 systemd[1]: Started NTP client/server.
Nov 22 02:20:51 np0005531887 systemd[1]: session-14.scope: Deactivated successfully.
Nov 22 02:20:51 np0005531887 systemd[1]: session-14.scope: Consumed 27.235s CPU time.
Nov 22 02:20:51 np0005531887 systemd-logind[821]: Session 14 logged out. Waiting for processes to exit.
Nov 22 02:20:51 np0005531887 systemd-logind[821]: Removed session 14.
Nov 22 02:20:57 np0005531887 systemd-logind[821]: New session 15 of user zuul.
Nov 22 02:20:57 np0005531887 systemd[1]: Started Session 15 of User zuul.
Nov 22 02:20:58 np0005531887 python3.9[64830]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:20:59 np0005531887 python3.9[64986]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:00 np0005531887 python3.9[65163]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:01 np0005531887 python3.9[65241]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.iszz51n3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:02 np0005531887 python3.9[65393]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:03 np0005531887 python3.9[65516]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796061.6848648-149-48347402780079/.source _original_basename=.8xb01eb5 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:03 np0005531887 python3.9[65668]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:04 np0005531887 python3.9[65820]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:05 np0005531887 python3.9[65943]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796064.096-221-268915545027017/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:05 np0005531887 python3.9[66095]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:06 np0005531887 python3.9[66218]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796065.378416-221-185139828615085/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:21:07 np0005531887 python3.9[66370]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:08 np0005531887 python3.9[66522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:08 np0005531887 python3.9[66645]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796067.3185163-332-6801628089356/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:09 np0005531887 python3.9[66797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:09 np0005531887 python3.9[66920]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796068.7572289-377-259588966643081/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:11 np0005531887 python3.9[67072]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:11 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:11 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:11 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:11 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:11 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:11 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:11 np0005531887 systemd[1]: Starting EDPM Container Shutdown...
Nov 22 02:21:11 np0005531887 systemd[1]: Finished EDPM Container Shutdown.
Nov 22 02:21:12 np0005531887 python3.9[67299]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:12 np0005531887 python3.9[67422]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796071.832587-446-239977544454071/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:13 np0005531887 python3.9[67574]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:14 np0005531887 python3.9[67697]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796073.0951226-491-132660097911453/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:15 np0005531887 python3.9[67849]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:15 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:15 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:15 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:15 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:15 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:15 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:15 np0005531887 systemd[1]: Starting Create netns directory...
Nov 22 02:21:15 np0005531887 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:21:15 np0005531887 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:21:15 np0005531887 systemd[1]: Finished Create netns directory.
Nov 22 02:21:16 np0005531887 python3.9[68076]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:21:16 np0005531887 network[68093]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:21:16 np0005531887 network[68094]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:21:16 np0005531887 network[68095]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:21:20 np0005531887 python3.9[68357]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:21 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:21 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:21 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:21 np0005531887 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 22 02:21:21 np0005531887 iptables.init[68397]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 22 02:21:21 np0005531887 iptables.init[68397]: iptables: Flushing firewall rules: [  OK  ]
Nov 22 02:21:21 np0005531887 systemd[1]: iptables.service: Deactivated successfully.
Nov 22 02:21:21 np0005531887 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 22 02:21:22 np0005531887 python3.9[68594]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:23 np0005531887 python3.9[68748]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:21:23 np0005531887 systemd[1]: Reloading.
Nov 22 02:21:23 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:21:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:21:23 np0005531887 systemd[1]: Starting Netfilter Tables...
Nov 22 02:21:23 np0005531887 systemd[1]: Finished Netfilter Tables.
Nov 22 02:21:24 np0005531887 python3.9[68940]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:25 np0005531887 python3.9[69093]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:26 np0005531887 python3.9[69218]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796085.2671673-698-94976098014219/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:27 np0005531887 python3.9[69371]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:21:27 np0005531887 systemd[1]: Reloading OpenSSH server daemon...
Nov 22 02:21:27 np0005531887 systemd[1]: Reloaded OpenSSH server daemon.
Nov 22 02:21:28 np0005531887 python3.9[69527]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:28 np0005531887 python3.9[69679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:29 np0005531887 python3.9[69802]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796088.3442407-791-191291765991172/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:30 np0005531887 python3.9[69954]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 02:21:30 np0005531887 systemd[1]: Starting Time & Date Service...
Nov 22 02:21:30 np0005531887 systemd[1]: Started Time & Date Service.
Nov 22 02:21:31 np0005531887 python3.9[70110]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:32 np0005531887 python3.9[70262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:33 np0005531887 python3.9[70385]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796091.8950288-896-4212593816559/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:33 np0005531887 python3.9[70537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:34 np0005531887 python3.9[70660]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796093.2365344-941-60665578795375/.source.yaml _original_basename=.f_2rjdmv follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:35 np0005531887 python3.9[70812]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:36 np0005531887 python3.9[70935]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796095.1307843-986-246281858542204/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:36 np0005531887 python3.9[71087]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:37 np0005531887 python3.9[71240]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:38 np0005531887 python3[71393]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:21:39 np0005531887 python3.9[71545]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:39 np0005531887 python3.9[71669]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796098.8027341-1103-59005575207211/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:40 np0005531887 python3.9[71822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:41 np0005531887 python3.9[71945]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796100.2017145-1148-238664208491705/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:42 np0005531887 python3.9[72097]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:43 np0005531887 python3.9[72220]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796101.5785995-1193-255737512770175/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:43 np0005531887 python3.9[72372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:44 np0005531887 python3.9[72495]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796103.3592458-1238-277456567772767/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:45 np0005531887 python3.9[72647]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:21:45 np0005531887 python3.9[72770]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796104.6226363-1283-242442010857475/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:46 np0005531887 python3.9[72922]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:47 np0005531887 python3.9[73074]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:21:48 np0005531887 python3.9[73233]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:49 np0005531887 python3.9[73386]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:50 np0005531887 python3.9[73538]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:21:51 np0005531887 python3.9[73690]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 02:21:51 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:21:51 np0005531887 python3.9[73844]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 02:21:52 np0005531887 systemd-logind[821]: Session 15 logged out. Waiting for processes to exit.
Nov 22 02:21:52 np0005531887 systemd[1]: session-15.scope: Deactivated successfully.
Nov 22 02:21:52 np0005531887 systemd[1]: session-15.scope: Consumed 37.892s CPU time.
Nov 22 02:21:52 np0005531887 systemd-logind[821]: Removed session 15.
Nov 22 02:21:58 np0005531887 systemd-logind[821]: New session 16 of user zuul.
Nov 22 02:21:58 np0005531887 systemd[1]: Started Session 16 of User zuul.
Nov 22 02:21:58 np0005531887 python3.9[74025]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 22 02:21:59 np0005531887 python3.9[74177]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:00 np0005531887 python3.9[74329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:00 np0005531887 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 02:22:01 np0005531887 python3.9[74483]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmnpAkzBA+/P5ygAqTpHSo/yxyshcDXOqGY2sZ6+LmKpfF/U/3puURRCYPFHLvU6Fe2oRGY6GwNjK7ej5idUzOOTgf6eMc2MfuxlwwYk9lQWXXYu3BIFbZTa/Jz2j3Jd5KpxE11/bi7aYfn5u+oXd0Q+EgbyaX14S6EGKPujybZZbWbPUjXyNBIpHDRP3QOvtmf0oXpNj7FZ/+eQ5okb2AzQeflovexeLh5/TrUuMpBgxJC+IT5bDgtr3scwyEN7Su9iQQos2qnNIIzuFTAJrbao4uS5RsC+rRO10O4Z+2p8nWhQuSG2tQ63gvUhaXg8h1KFhHYfclNow/Nzxq1rSASWv2iNeUsoDWgxH7Yq3GPbGEofld095ADvo32HdVYHmdYEaD9GLY7WKHW6ilz14vUYQ6cN6XZoli1rdTt1Z/UWpQSy64npnbT3IGeztmD2KPGZP2laTkFkxzTh7m0Dz2sBx1rbfhQV8SjNw0ZkeSV+G3sqXWqozNXMvk1k7Ma/8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPqC7967YjYmXjy5Y1Atr1idIuJEqYVlUbJ/ivnjEtjv#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJYSJRTKIRJHGorxvpDox6ZIDiNrie6EQnECMuD5IFEY7kEn/cP5JLTUpe4kf0aZt1r5R4WnwY6StRedSzkyRk0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCowOVwCxoeDeu/gjiGxT6DsxjadsI6OsklC9oYH1562wrbeZDtXV2FXgAB8clz6v5hIpHsJBOPMHniNRFVQwlu7A3igwl6rkisIR380P+Ttep7r2pEz15KdpK86MS3svcPZn5qKpfnr+3JUxX9Kt71rH4jGzpDSQCFB/vJPmodINZL7o8vaTg1Gz0vkf+zJlmQjq3fUKVrInLbL6hPyuV8pXqtw3q+JYCIrXJlHDFPOngM4PsGnOJL6j9PaOEdRXK30tQNQlzko6lfntblufy8mAb/o9Sn1ulCIbI1nIJqkTVm9aK31C4nWSPumTQ9GLdi0dvultCwMbw0ym7pzFAWlxrsx4V9GRz2yqAPLbNwEFoaA42ScSLnQpq+Y3747tGiT5jdKz2AyCBa6sN43tUXKR/mtjBpXXoOsCvgvzvnlul+TRmjoju2jFsL05dlNImskQ1UwAn5iIr+7TvzDF03jeQYani/6aykV0z4KJyt0VneL9fYnSlSZ7dnpTfkYgc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII6yEJELXMtPQkFh9QeTL6LtFdllgEtcCx/vTvD4VQRD#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENm8i1piornPA+bu3lQ1gnDuQ+S2zp/iE9MvAGxNHKKvA3MMS333GJWFx+BV6kDFZ9hDTDj/kimvzvpu8lziNQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQtcPDTu5W9vsjUBaPvvVOkaIA7MPqUmOieWOqa7ySB03c8aREkaNDH3Zlp68jXwdC1Qpw0/2EGQ83sfaSlvG6XSE4QBwVDoOMe7GzTY8agr/ZZOIedAz8v04HH0OpnsD0tqLQlZZ0nuBJ7UM7iP5PTbc7O1Z2n35+F+XTqiKfSmsCSxhnwJhgyZBKS0HJUIsvQoVw1N699OnanMSweTImsEURAEEsL3zrVM9Qa/uw2XH5LTuU9kXzfqKNgy/5VXcEbamLe+cPbFPKDc8ei1sCASL3xDbyGriLdNKOiSjytc5GTcG0eg5aHmBxz1/KWYAf9JCs/xAGk0Nifft+xlfC1OwkiPBCHsUlfVWzERxno/lVQQGvrNTgMZ1G/lJwuhRYCWlScgfADkcZSitTszGd/qlunDx3biSKRE1RnACqaNsF0Hum0S7m6d8wxj6TNoD618+lN82HqRhRMhrVQ+hxQySpHEXSWTdhfVLND+2neEL9hyT0cCzB33FvD+YacRk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIBd7JSklzBfXUPIvKiAxXVL//OQf5r0dI648cExbdgs#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAE41g4YqzC3bfy1t/lRYP5p85+7h3wD8DzLbz0LtdbkROkWg/OHzC73WNbkqdHKqwacHfch6fbycv9mIDE73cM=#012 create=True mode=0644 path=/tmp/ansible.u5o14bjy state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:02 np0005531887 python3.9[74635]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.u5o14bjy' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:03 np0005531887 python3.9[74789]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.u5o14bjy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:03 np0005531887 systemd[1]: session-16.scope: Deactivated successfully.
Nov 22 02:22:03 np0005531887 systemd[1]: session-16.scope: Consumed 3.552s CPU time.
Nov 22 02:22:03 np0005531887 systemd-logind[821]: Session 16 logged out. Waiting for processes to exit.
Nov 22 02:22:03 np0005531887 systemd-logind[821]: Removed session 16.
Nov 22 02:22:09 np0005531887 systemd-logind[821]: New session 17 of user zuul.
Nov 22 02:22:09 np0005531887 systemd[1]: Started Session 17 of User zuul.
Nov 22 02:22:10 np0005531887 python3.9[74967]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:11 np0005531887 python3.9[75123]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 02:22:12 np0005531887 python3.9[75277]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:22:13 np0005531887 python3.9[75430]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:14 np0005531887 python3.9[75583]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:15 np0005531887 python3.9[75737]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:16 np0005531887 python3.9[75892]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:16 np0005531887 systemd[1]: session-17.scope: Deactivated successfully.
Nov 22 02:22:16 np0005531887 systemd[1]: session-17.scope: Consumed 4.683s CPU time.
Nov 22 02:22:16 np0005531887 systemd-logind[821]: Session 17 logged out. Waiting for processes to exit.
Nov 22 02:22:16 np0005531887 systemd-logind[821]: Removed session 17.
Nov 22 02:22:21 np0005531887 systemd-logind[821]: New session 18 of user zuul.
Nov 22 02:22:21 np0005531887 systemd[1]: Started Session 18 of User zuul.
Nov 22 02:22:22 np0005531887 python3.9[76072]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:24 np0005531887 python3.9[76228]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:22:25 np0005531887 python3.9[76312]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 02:22:27 np0005531887 python3.9[76463]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:22:28 np0005531887 python3.9[76614]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:22:29 np0005531887 python3.9[76764]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:30 np0005531887 python3.9[76914]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:22:30 np0005531887 systemd[1]: session-18.scope: Deactivated successfully.
Nov 22 02:22:30 np0005531887 systemd[1]: session-18.scope: Consumed 6.364s CPU time.
Nov 22 02:22:30 np0005531887 systemd-logind[821]: Session 18 logged out. Waiting for processes to exit.
Nov 22 02:22:30 np0005531887 systemd-logind[821]: Removed session 18.
Nov 22 02:22:36 np0005531887 systemd-logind[821]: New session 19 of user zuul.
Nov 22 02:22:36 np0005531887 systemd[1]: Started Session 19 of User zuul.
Nov 22 02:22:37 np0005531887 python3.9[77092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:22:39 np0005531887 python3.9[77248]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:40 np0005531887 python3.9[77400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:41 np0005531887 python3.9[77552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:41 np0005531887 python3.9[77675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796160.7004309-165-29901555755537/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=0d902c408c37eeba0929bd917348c07cde6ccbc7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:42 np0005531887 python3.9[77827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:43 np0005531887 python3.9[77950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796162.1450326-165-58955341413130/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=16bd60fbc24423e3fc1bbc9e201827083d9b5e39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:43 np0005531887 python3.9[78102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:44 np0005531887 python3.9[78225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796163.3925319-165-275325021184696/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=8c989cbbf81763d8d455f239d470b85744219dcc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:45 np0005531887 python3.9[78377]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:45 np0005531887 python3.9[78529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:46 np0005531887 python3.9[78681]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:47 np0005531887 python3.9[78804]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796166.1261702-351-146504273424875/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=6c4ca68e1ab01db614a1ca5bcf262685ac301b09 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:47 np0005531887 python3.9[78956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:48 np0005531887 python3.9[79079]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796167.3597898-351-2902989978237/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=8ad07d9f15fb881d541cc871f705c812e1318a58 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:49 np0005531887 python3.9[79231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:49 np0005531887 python3.9[79354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796168.5843954-351-77221722462303/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=34d934131778711b06f52e7da501821d13d8e7e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:50 np0005531887 python3.9[79506]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:51 np0005531887 python3.9[79658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:51 np0005531887 python3.9[79810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:52 np0005531887 python3.9[79933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796171.3095508-531-7105092085134/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ee2868b069477fa0e6787e01ced2e0b286eb18ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:53 np0005531887 python3.9[80085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:53 np0005531887 python3.9[80208]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796172.5589776-531-237619700484933/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:54 np0005531887 python3.9[80360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:54 np0005531887 python3.9[80483]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796173.8054595-531-6380900987398/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=28968bcd43485129f8b7f2be95cef086c36f2582 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:55 np0005531887 python3.9[80635]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:56 np0005531887 python3.9[80787]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:22:57 np0005531887 python3.9[80939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:57 np0005531887 python3.9[81062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796176.515882-716-41815485117454/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=359587cdd763ace6f2d359733fc74f653d8c09a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:58 np0005531887 python3.9[81214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:58 np0005531887 python3.9[81337]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796177.7650695-716-101521360494069/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:22:59 np0005531887 python3.9[81489]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:22:59 np0005531887 chronyd[64650]: Selected source 216.128.178.20 (pool.ntp.org)
Nov 22 02:22:59 np0005531887 python3.9[81614]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796178.9284575-716-171681246879372/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=30dd4528a0519375aaaefa6effa870290646a4be backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:01 np0005531887 python3.9[81766]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:02 np0005531887 python3.9[81920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:02 np0005531887 python3.9[82043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796181.4796183-923-147978528522913/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:03 np0005531887 python3.9[82195]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:04 np0005531887 python3.9[82347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:04 np0005531887 python3.9[82470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796183.5890362-1003-5329234795530/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:05 np0005531887 python3.9[82622]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:06 np0005531887 python3.9[82774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:06 np0005531887 python3.9[82897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796185.5192409-1076-83739894206423/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:07 np0005531887 python3.9[83049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:08 np0005531887 python3.9[83201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:08 np0005531887 python3.9[83324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796187.5644472-1148-159064912437753/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:09 np0005531887 python3.9[83476]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:09 np0005531887 python3.9[83628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:10 np0005531887 python3.9[83751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796189.5067997-1218-240894965365044/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:11 np0005531887 python3.9[83903]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:11 np0005531887 python3.9[84055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:12 np0005531887 python3.9[84178]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796191.4405391-1288-260167847719573/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:13 np0005531887 python3.9[84330]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:13 np0005531887 python3.9[84482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:14 np0005531887 python3.9[84605]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796193.3828225-1356-220503013966625/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:14 np0005531887 systemd[1]: session-19.scope: Deactivated successfully.
Nov 22 02:23:14 np0005531887 systemd[1]: session-19.scope: Consumed 30.751s CPU time.
Nov 22 02:23:14 np0005531887 systemd-logind[821]: Session 19 logged out. Waiting for processes to exit.
Nov 22 02:23:14 np0005531887 systemd-logind[821]: Removed session 19.
Nov 22 02:23:20 np0005531887 systemd-logind[821]: New session 20 of user zuul.
Nov 22 02:23:20 np0005531887 systemd[1]: Started Session 20 of User zuul.
Nov 22 02:23:21 np0005531887 python3.9[84784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:22 np0005531887 python3.9[84940]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:23 np0005531887 python3.9[85092]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:24 np0005531887 python3.9[85242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:24 np0005531887 python3.9[85394]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 02:23:26 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 22 02:23:27 np0005531887 python3.9[85550]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:23:28 np0005531887 python3.9[85634]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:23:30 np0005531887 python3.9[85787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:23:31 np0005531887 python3[85942]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 22 02:23:32 np0005531887 python3.9[86094]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:33 np0005531887 python3.9[86246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:33 np0005531887 python3.9[86324]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:34 np0005531887 python3.9[86476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:34 np0005531887 python3.9[86554]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wvpuhx83 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:35 np0005531887 python3.9[86706]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:35 np0005531887 python3.9[86784]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:36 np0005531887 python3.9[86936]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:37 np0005531887 python3[87089]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:23:38 np0005531887 python3.9[87241]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:39 np0005531887 python3.9[87368]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796217.9569361-437-183023864644832/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:39 np0005531887 python3.9[87520]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:40 np0005531887 python3.9[87645]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796219.384581-482-250564028944279/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:41 np0005531887 python3.9[87797]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:41 np0005531887 python3.9[87922]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796220.651986-527-95016325222887/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:42 np0005531887 python3.9[88074]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:43 np0005531887 python3.9[88199]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796222.0995831-572-154044885568732/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:43 np0005531887 python3.9[88351]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:44 np0005531887 python3.9[88476]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796223.4540606-617-46928563962594/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:45 np0005531887 python3.9[88628]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:46 np0005531887 python3.9[88780]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:46 np0005531887 python3.9[88935]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:47 np0005531887 python3.9[89087]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:48 np0005531887 python3.9[89240]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:23:49 np0005531887 python3.9[89394]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:49 np0005531887 python3.9[89550]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:51 np0005531887 python3.9[89700]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:23:52 np0005531887 python3.9[89853]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:52 np0005531887 ovs-vsctl[89854]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 22 02:23:53 np0005531887 python3.9[90006]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:54 np0005531887 python3.9[90161]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:23:54 np0005531887 ovs-vsctl[90162]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 22 02:23:55 np0005531887 python3.9[90312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:23:55 np0005531887 python3.9[90466]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:56 np0005531887 python3.9[90618]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:57 np0005531887 python3.9[90696]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:57 np0005531887 python3.9[90848]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:23:58 np0005531887 python3.9[90926]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:23:58 np0005531887 python3.9[91078]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:23:59 np0005531887 python3.9[91230]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:00 np0005531887 python3.9[91308]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:00 np0005531887 python3.9[91460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:01 np0005531887 python3.9[91538]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:02 np0005531887 python3.9[91690]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:02 np0005531887 systemd[1]: Reloading.
Nov 22 02:24:02 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:02 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:03 np0005531887 python3.9[91879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:03 np0005531887 python3.9[91957]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:04 np0005531887 python3.9[92109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:05 np0005531887 python3.9[92187]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:06 np0005531887 python3.9[92339]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:06 np0005531887 systemd[1]: Reloading.
Nov 22 02:24:06 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:06 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:06 np0005531887 systemd[1]: Starting Create netns directory...
Nov 22 02:24:06 np0005531887 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:24:06 np0005531887 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:24:06 np0005531887 systemd[1]: Finished Create netns directory.
Nov 22 02:24:07 np0005531887 python3.9[92533]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:08 np0005531887 python3.9[92685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:08 np0005531887 python3.9[92808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796247.5546818-1370-28793653953252/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:10 np0005531887 python3.9[92960]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:10 np0005531887 python3.9[93112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:11 np0005531887 python3.9[93235]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796250.3946548-1445-116160762287065/.source.json _original_basename=.5bdtwgw2 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:12 np0005531887 python3.9[93387]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:14 np0005531887 python3.9[93814]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 22 02:24:15 np0005531887 python3.9[93966]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:24:16 np0005531887 python3.9[94118]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:24:16 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531887 python3[94281]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:24:18 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:18 np0005531887 podman[94317]: 2025-11-22 07:24:18.702077643 +0000 UTC m=+0.058849720 container create 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:24:18 np0005531887 podman[94317]: 2025-11-22 07:24:18.668622934 +0000 UTC m=+0.025395031 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:24:18 np0005531887 python3[94281]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 02:24:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 02:24:19 np0005531887 python3.9[94503]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:20 np0005531887 python3.9[94657]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:20 np0005531887 python3.9[94733]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:21 np0005531887 python3.9[94884]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796260.882671-1709-272511753715120/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:24:22 np0005531887 python3.9[94960]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:24:22 np0005531887 systemd[1]: Reloading.
Nov 22 02:24:22 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:22 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531887 python3.9[95072]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:24:23 np0005531887 systemd[1]: Reloading.
Nov 22 02:24:23 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:23 np0005531887 systemd[1]: Starting ovn_controller container...
Nov 22 02:24:23 np0005531887 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 22 02:24:23 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:24:23 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c77cb235b60ff0db8d97c42cadf42a068fe19bc3f1d9d001f5e0041127c8ce6d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:24:23 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d.
Nov 22 02:24:23 np0005531887 podman[95115]: 2025-11-22 07:24:23.753353823 +0000 UTC m=+0.136567854 container init 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Nov 22 02:24:23 np0005531887 ovn_controller[95130]: + sudo -E kolla_set_configs
Nov 22 02:24:23 np0005531887 podman[95115]: 2025-11-22 07:24:23.781713293 +0000 UTC m=+0.164927304 container start 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:24:23 np0005531887 edpm-start-podman-container[95115]: ovn_controller
Nov 22 02:24:23 np0005531887 systemd[1]: Created slice User Slice of UID 0.
Nov 22 02:24:23 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 22 02:24:23 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 22 02:24:23 np0005531887 systemd[1]: Starting User Manager for UID 0...
Nov 22 02:24:23 np0005531887 edpm-start-podman-container[95114]: Creating additional drop-in dependency for "ovn_controller" (5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d)
Nov 22 02:24:23 np0005531887 systemd[1]: Reloading.
Nov 22 02:24:23 np0005531887 podman[95137]: 2025-11-22 07:24:23.905151429 +0000 UTC m=+0.109644261 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:24:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:24:23 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:24:23 np0005531887 systemd[95165]: Queued start job for default target Main User Target.
Nov 22 02:24:24 np0005531887 systemd[95165]: Created slice User Application Slice.
Nov 22 02:24:24 np0005531887 systemd[95165]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 22 02:24:24 np0005531887 systemd[95165]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:24:24 np0005531887 systemd[95165]: Reached target Paths.
Nov 22 02:24:24 np0005531887 systemd[95165]: Reached target Timers.
Nov 22 02:24:24 np0005531887 systemd[95165]: Starting D-Bus User Message Bus Socket...
Nov 22 02:24:24 np0005531887 systemd[95165]: Starting Create User's Volatile Files and Directories...
Nov 22 02:24:24 np0005531887 systemd[95165]: Finished Create User's Volatile Files and Directories.
Nov 22 02:24:24 np0005531887 systemd[95165]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:24:24 np0005531887 systemd[95165]: Reached target Sockets.
Nov 22 02:24:24 np0005531887 systemd[95165]: Reached target Basic System.
Nov 22 02:24:24 np0005531887 systemd[95165]: Reached target Main User Target.
Nov 22 02:24:24 np0005531887 systemd[95165]: Startup finished in 152ms.
Nov 22 02:24:24 np0005531887 systemd[1]: Started User Manager for UID 0.
Nov 22 02:24:24 np0005531887 systemd[1]: Started ovn_controller container.
Nov 22 02:24:24 np0005531887 systemd[1]: 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d-4a784ad4574a3775.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:24:24 np0005531887 systemd[1]: 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d-4a784ad4574a3775.service: Failed with result 'exit-code'.
Nov 22 02:24:24 np0005531887 systemd[1]: Started Session c1 of User root.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: INFO:__main__:Validating config file
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: INFO:__main__:Writing out command to execute
Nov 22 02:24:24 np0005531887 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: ++ cat /run_command
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + ARGS=
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + sudo kolla_copy_cacerts
Nov 22 02:24:24 np0005531887 systemd[1]: Started Session c2 of User root.
Nov 22 02:24:24 np0005531887 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + [[ ! -n '' ]]
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + . kolla_extend_start
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + umask 0022
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 22 02:24:24 np0005531887 NetworkManager[55210]: <info>  [1763796264.3096] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 22 02:24:24 np0005531887 NetworkManager[55210]: <info>  [1763796264.3104] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:24:24 np0005531887 NetworkManager[55210]: <info>  [1763796264.3115] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 22 02:24:24 np0005531887 NetworkManager[55210]: <info>  [1763796264.3121] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 22 02:24:24 np0005531887 NetworkManager[55210]: <info>  [1763796264.3139] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:24:24 np0005531887 kernel: br-int: entered promiscuous mode
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00010|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00011|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00013|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00014|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00015|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 22 02:24:24 np0005531887 systemd-udevd[95258]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00017|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00018|features|INFO|OVS Feature: ct_flush, state: supported
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00019|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 22 02:24:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:24Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 22 02:24:24 np0005531887 python3.9[95387]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:24 np0005531887 ovs-vsctl[95388]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 22 02:24:25 np0005531887 python3.9[95540]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:25 np0005531887 ovs-vsctl[95542]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 02:24:26 np0005531887 python3.9[95695]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:24:26 np0005531887 NetworkManager[55210]: <info>  [1763796266.8844] manager: (ovn-df0984-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 22 02:24:26 np0005531887 NetworkManager[55210]: <info>  [1763796266.8861] manager: (ovn-4984e1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 22 02:24:26 np0005531887 NetworkManager[55210]: <info>  [1763796266.8868] manager: (ovn-e686e2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 22 02:24:26 np0005531887 ovs-vsctl[95697]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 22 02:24:26 np0005531887 kernel: genev_sys_6081: entered promiscuous mode
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:26Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 02:24:26 np0005531887 NetworkManager[55210]: <info>  [1763796266.9134] device (genev_sys_6081): carrier: link connected
Nov 22 02:24:26 np0005531887 NetworkManager[55210]: <info>  [1763796266.9138] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 22 02:24:27 np0005531887 systemd[1]: session-20.scope: Deactivated successfully.
Nov 22 02:24:27 np0005531887 systemd[1]: session-20.scope: Consumed 46.949s CPU time.
Nov 22 02:24:27 np0005531887 systemd-logind[821]: Session 20 logged out. Waiting for processes to exit.
Nov 22 02:24:27 np0005531887 systemd-logind[821]: Removed session 20.
Nov 22 02:24:34 np0005531887 systemd[1]: Stopping User Manager for UID 0...
Nov 22 02:24:34 np0005531887 systemd[95165]: Activating special unit Exit the Session...
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped target Main User Target.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped target Basic System.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped target Paths.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped target Sockets.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped target Timers.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:24:34 np0005531887 systemd[95165]: Closed D-Bus User Message Bus Socket.
Nov 22 02:24:34 np0005531887 systemd[95165]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:24:34 np0005531887 systemd[95165]: Removed slice User Application Slice.
Nov 22 02:24:34 np0005531887 systemd[95165]: Reached target Shutdown.
Nov 22 02:24:34 np0005531887 systemd[95165]: Finished Exit the Session.
Nov 22 02:24:34 np0005531887 systemd[95165]: Reached target Exit the Session.
Nov 22 02:24:34 np0005531887 systemd[1]: user@0.service: Deactivated successfully.
Nov 22 02:24:34 np0005531887 systemd[1]: Stopped User Manager for UID 0.
Nov 22 02:24:34 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 22 02:24:34 np0005531887 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 22 02:24:34 np0005531887 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 22 02:24:34 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 22 02:24:34 np0005531887 systemd[1]: Removed slice User Slice of UID 0.
Nov 22 02:24:35 np0005531887 systemd-logind[821]: New session 22 of user zuul.
Nov 22 02:24:35 np0005531887 systemd[1]: Started Session 22 of User zuul.
Nov 22 02:24:37 np0005531887 python3.9[95881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:24:38 np0005531887 python3.9[96037]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:39 np0005531887 python3.9[96189]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:40 np0005531887 python3.9[96341]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:40 np0005531887 python3.9[96493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:41 np0005531887 python3.9[96645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:42 np0005531887 python3.9[96795]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:24:43 np0005531887 python3.9[96948]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 02:24:45 np0005531887 python3.9[97098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:45 np0005531887 python3.9[97219]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796284.3953788-224-64343513795291/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:46 np0005531887 python3.9[97369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:47 np0005531887 python3.9[97490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796286.157411-269-97244368542165/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:48 np0005531887 python3.9[97642]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:24:49 np0005531887 python3.9[97726]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:24:51 np0005531887 python3.9[97879]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:24:52 np0005531887 python3.9[98032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:53 np0005531887 python3.9[98153]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796292.2630494-380-143487003115702/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:54 np0005531887 python3.9[98303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:54 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:54Z|00025|memory|INFO|16256 kB peak resident set size after 30.3 seconds
Nov 22 02:24:54 np0005531887 ovn_controller[95130]: 2025-11-22T07:24:54Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 22 02:24:54 np0005531887 podman[98398]: 2025-11-22 07:24:54.666340462 +0000 UTC m=+0.181627529 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 22 02:24:54 np0005531887 python3.9[98435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796293.6576426-380-139670501816351/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:56 np0005531887 python3.9[98598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:57 np0005531887 python3.9[98719]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796296.3684063-512-192783518302528/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:57 np0005531887 python3.9[98869]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:24:58 np0005531887 python3.9[98990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796297.4927688-512-195162106669674/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:24:59 np0005531887 python3.9[99141]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:24:59 np0005531887 python3.9[99296]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:00 np0005531887 python3.9[99448]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:01 np0005531887 python3.9[99526]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:01 np0005531887 python3.9[99678]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:02 np0005531887 python3.9[99756]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:02 np0005531887 python3.9[99908]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:03 np0005531887 python3.9[100060]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:04 np0005531887 python3.9[100138]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:04 np0005531887 python3.9[100290]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:05 np0005531887 python3.9[100368]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:05 np0005531887 python3.9[100520]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:05 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:05 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:05 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:06 np0005531887 python3.9[100710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:07 np0005531887 python3.9[100788]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:07 np0005531887 python3.9[100940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:08 np0005531887 python3.9[101018]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:09 np0005531887 python3.9[101170]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:09 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:09 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:09 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:09 np0005531887 systemd[1]: Starting Create netns directory...
Nov 22 02:25:09 np0005531887 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:25:09 np0005531887 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:25:09 np0005531887 systemd[1]: Finished Create netns directory.
Nov 22 02:25:10 np0005531887 python3.9[101362]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:10 np0005531887 python3.9[101514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:11 np0005531887 python3.9[101637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796310.4551783-965-43985957637214/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:12 np0005531887 python3.9[101789]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:25:13 np0005531887 python3.9[101941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:25:13 np0005531887 python3.9[102064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796312.8043303-1040-245858040961110/.source.json _original_basename=.cunj3mbs follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:14 np0005531887 python3.9[102216]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:17 np0005531887 python3.9[102643]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 22 02:25:18 np0005531887 python3.9[102795]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:25:19 np0005531887 python3.9[102947]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:25:20 np0005531887 python3[103125]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:25:26 np0005531887 podman[103182]: 2025-11-22 07:25:26.669403562 +0000 UTC m=+1.882191174 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:25:27 np0005531887 podman[103138]: 2025-11-22 07:25:27.768769825 +0000 UTC m=+6.909905551 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:27 np0005531887 podman[103262]: 2025-11-22 07:25:27.923925631 +0000 UTC m=+0.051542979 container create cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 02:25:27 np0005531887 podman[103262]: 2025-11-22 07:25:27.898527736 +0000 UTC m=+0.026145104 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:27 np0005531887 python3[103125]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:25:28 np0005531887 python3.9[103453]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:25:29 np0005531887 python3.9[103607]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:30 np0005531887 python3.9[103683]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:25:30 np0005531887 python3.9[103834]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796330.1530843-1304-30608118274093/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:31 np0005531887 python3.9[103910]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:25:31 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:31 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:31 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:32 np0005531887 python3.9[104021]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:32 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:32 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:32 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:33 np0005531887 systemd[1]: Starting ovn_metadata_agent container...
Nov 22 02:25:33 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:25:33 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bff3813fb747326fe0d25d508b3b522239fff63bc55482b67b0e382da45db0dc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 22 02:25:33 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bff3813fb747326fe0d25d508b3b522239fff63bc55482b67b0e382da45db0dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:25:34 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da.
Nov 22 02:25:35 np0005531887 podman[104063]: 2025-11-22 07:25:35.092762527 +0000 UTC m=+1.645228243 container init cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + sudo -E kolla_set_configs
Nov 22 02:25:35 np0005531887 podman[104063]: 2025-11-22 07:25:35.125193364 +0000 UTC m=+1.677659050 container start cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Validating config file
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Copying service configuration files
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Writing out command to execute
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: ++ cat /run_command
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + CMD=neutron-ovn-metadata-agent
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + ARGS=
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + sudo kolla_copy_cacerts
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + [[ ! -n '' ]]
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + . kolla_extend_start
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: Running command: 'neutron-ovn-metadata-agent'
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + umask 0022
Nov 22 02:25:35 np0005531887 ovn_metadata_agent[104079]: + exec neutron-ovn-metadata-agent
Nov 22 02:25:35 np0005531887 edpm-start-podman-container[104063]: ovn_metadata_agent
Nov 22 02:25:35 np0005531887 edpm-start-podman-container[104062]: Creating additional drop-in dependency for "ovn_metadata_agent" (cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da)
Nov 22 02:25:35 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:35 np0005531887 podman[104085]: 2025-11-22 07:25:35.42055306 +0000 UTC m=+0.279144133 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 02:25:35 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:35 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:35 np0005531887 systemd[1]: Started ovn_metadata_agent container.
Nov 22 02:25:36 np0005531887 systemd[1]: session-22.scope: Deactivated successfully.
Nov 22 02:25:36 np0005531887 systemd[1]: session-22.scope: Consumed 50.665s CPU time.
Nov 22 02:25:36 np0005531887 systemd-logind[821]: Session 22 logged out. Waiting for processes to exit.
Nov 22 02:25:36 np0005531887 systemd-logind[821]: Removed session 22.
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.247 104084 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.247 104084 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.248 104084 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.248 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.248 104084 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.248 104084 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.249 104084 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.249 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.249 104084 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.249 104084 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.249 104084 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.250 104084 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.251 104084 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.252 104084 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.253 104084 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.254 104084 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.255 104084 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.256 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.257 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.258 104084 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.259 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.260 104084 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.261 104084 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.262 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.263 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.264 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.265 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.266 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.267 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.268 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.269 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.270 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.271 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.272 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.273 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.274 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.275 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.276 104084 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.277 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.278 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.279 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.280 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.281 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.282 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.283 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.284 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.285 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.286 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.287 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.288 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.289 104084 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.290 104084 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.299 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.299 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.300 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.300 104084 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.300 104084 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.315 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 73ab1342-b2af-4236-8199-7d435ebce194 (UUID: 73ab1342-b2af-4236-8199-7d435ebce194) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.344 104084 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.345 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.345 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.345 104084 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.350 104084 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.356 104084 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.362 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '73ab1342-b2af-4236-8199-7d435ebce194'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], external_ids={}, name=73ab1342-b2af-4236-8199-7d435ebce194, nb_cfg_timestamp=1763796272331, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.363 104084 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f84c9b17a90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.364 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.364 104084 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.364 104084 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.364 104084 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.369 104084 DEBUG oslo_service.service [-] Started child 104195 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.373 104195 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-430721'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.374 104084 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8n_1_bnl/privsep.sock']#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.398 104195 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.398 104195 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.399 104195 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.402 104195 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.409 104195 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 22 02:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.415 104195 INFO eventlet.wsgi.server [-] (104195) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 22 02:25:37 np0005531887 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.076 104084 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.077 104084 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8n_1_bnl/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.934 104200 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.938 104200 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.942 104200 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:37.943 104200 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104200
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.081 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a41d2df2-e604-4153-a723-f275a938d8fc]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.622 104200 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.622 104200 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:25:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:38.622 104200 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:25:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:39.246 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[62b74d4a-be57-4e01-88e5-b7ad4463bcab]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 02:25:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:39.248 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, column=external_ids, values=({'neutron:ovn-metadata-id': '66fb37dc-e977-50f3-8e50-4afdd4c6b9be'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:25:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:39.693 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.018 104084 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.019 104084 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.020 104084 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.021 104084 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.022 104084 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.023 104084 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.024 104084 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.025 104084 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.026 104084 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.027 104084 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.028 104084 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.029 104084 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.030 104084 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.031 104084 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.032 104084 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.033 104084 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.034 104084 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.034 104084 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.034 104084 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.034 104084 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.034 104084 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.035 104084 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.036 104084 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.037 104084 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.038 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.039 104084 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.040 104084 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.041 104084 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.042 104084 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.043 104084 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.043 104084 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.043 104084 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.043 104084 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.043 104084 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.044 104084 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.045 104084 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.046 104084 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.047 104084 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.048 104084 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.049 104084 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.050 104084 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.051 104084 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.052 104084 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.053 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.054 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.055 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.056 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.057 104084 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.058 104084 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.058 104084 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:25:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:25:40.058 104084 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:25:41 np0005531887 systemd-logind[821]: New session 23 of user zuul.
Nov 22 02:25:41 np0005531887 systemd[1]: Started Session 23 of User zuul.
Nov 22 02:25:42 np0005531887 python3.9[104360]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:25:44 np0005531887 python3.9[104516]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:25:45 np0005531887 python3.9[104681]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:25:45 np0005531887 systemd[1]: Reloading.
Nov 22 02:25:45 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:25:45 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:25:46 np0005531887 python3.9[104865]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:25:46 np0005531887 network[104882]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:25:46 np0005531887 network[104883]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:25:46 np0005531887 network[104884]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:25:50 np0005531887 python3.9[105145]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:51 np0005531887 python3.9[105298]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:52 np0005531887 python3.9[105451]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:53 np0005531887 python3.9[105604]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:53 np0005531887 python3.9[105757]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:54 np0005531887 python3.9[105910]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:55 np0005531887 python3.9[106063]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:25:57 np0005531887 python3.9[106216]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:57 np0005531887 podman[106369]: 2025-11-22 07:25:57.916351367 +0000 UTC m=+0.121775767 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 02:25:58 np0005531887 python3.9[106368]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:25:58 np0005531887 python3.9[106546]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:01 np0005531887 python3.9[106698]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:02 np0005531887 python3.9[106850]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:02 np0005531887 python3.9[107002]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:03 np0005531887 python3.9[107154]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:04 np0005531887 python3.9[107306]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:04 np0005531887 python3.9[107458]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:05 np0005531887 python3.9[107610]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:05 np0005531887 podman[107693]: 2025-11-22 07:26:05.833665022 +0000 UTC m=+0.050415753 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:26:06 np0005531887 python3.9[107781]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:06 np0005531887 python3.9[107933]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:07 np0005531887 python3.9[108085]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:08 np0005531887 python3.9[108237]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:26:08 np0005531887 python3.9[108389]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:09 np0005531887 python3.9[108541]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:26:10 np0005531887 python3.9[108693]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:26:10 np0005531887 systemd[1]: Reloading.
Nov 22 02:26:10 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:26:10 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:26:11 np0005531887 python3.9[108879]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:12 np0005531887 python3.9[109032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:12 np0005531887 python3.9[109185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:13 np0005531887 python3.9[109338]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:14 np0005531887 python3.9[109491]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:14 np0005531887 python3.9[109644]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:15 np0005531887 python3.9[109797]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:26:16 np0005531887 python3.9[109950]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 22 02:26:17 np0005531887 python3.9[110103]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:26:18 np0005531887 python3.9[110261]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:26:19 np0005531887 python3.9[110421]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:26:20 np0005531887 python3.9[110505]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:26:28 np0005531887 podman[110519]: 2025-11-22 07:26:28.89245484 +0000 UTC m=+0.106342273 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:26:36 np0005531887 podman[110719]: 2025-11-22 07:26:36.85476159 +0000 UTC m=+0.061799664 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 02:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:26:37.292 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:26:37.293 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:26:37.293 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:26:50 np0005531887 kernel: SELinux:  Converting 2758 SID table entries...
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:26:50 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:26:59 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 22 02:26:59 np0005531887 podman[110755]: 2025-11-22 07:26:59.88983136 +0000 UTC m=+0.097217934 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:27:00 np0005531887 kernel: SELinux:  Converting 2758 SID table entries...
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:27:00 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:27:07 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 22 02:27:07 np0005531887 podman[110789]: 2025-11-22 07:27:07.84724887 +0000 UTC m=+0.060356237 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:27:30 np0005531887 podman[119274]: 2025-11-22 07:27:30.89103192 +0000 UTC m=+0.104211573 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 02:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:27:37.293 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:27:37.295 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:27:37.296 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:27:38 np0005531887 podman[123884]: 2025-11-22 07:27:38.884883285 +0000 UTC m=+0.095384549 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 02:27:59 np0005531887 kernel: SELinux:  Converting 2759 SID table entries...
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability open_perms=1
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability always_check_network=0
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 02:27:59 np0005531887 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 02:28:00 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 22 02:28:00 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:28:00 np0005531887 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 02:28:01 np0005531887 podman[127670]: 2025-11-22 07:28:01.050283407 +0000 UTC m=+0.110867594 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:28:09 np0005531887 podman[127924]: 2025-11-22 07:28:09.872996942 +0000 UTC m=+0.082961199 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 02:28:11 np0005531887 systemd[1]: Stopping OpenSSH server daemon...
Nov 22 02:28:11 np0005531887 systemd[1]: sshd.service: Deactivated successfully.
Nov 22 02:28:11 np0005531887 systemd[1]: Stopped OpenSSH server daemon.
Nov 22 02:28:11 np0005531887 systemd[1]: sshd.service: Consumed 3.860s CPU time, read 32.0K from disk, written 84.0K to disk.
Nov 22 02:28:11 np0005531887 systemd[1]: Stopped target sshd-keygen.target.
Nov 22 02:28:11 np0005531887 systemd[1]: Stopping sshd-keygen.target...
Nov 22 02:28:11 np0005531887 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:11 np0005531887 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:11 np0005531887 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 02:28:11 np0005531887 systemd[1]: Reached target sshd-keygen.target.
Nov 22 02:28:11 np0005531887 systemd[1]: Starting OpenSSH server daemon...
Nov 22 02:28:11 np0005531887 systemd[1]: Started OpenSSH server daemon.
Nov 22 02:28:13 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:28:13 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:28:13 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:13 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:13 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:13 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:28:18 np0005531887 python3.9[133080]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:18 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:18 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:18 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:19 np0005531887 python3.9[134218]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:19 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:19 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:19 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:20 np0005531887 python3.9[135411]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:20 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:21 np0005531887 python3.9[136646]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:22 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:22 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:22 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:23 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:28:23 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:28:23 np0005531887 systemd[1]: man-db-cache-update.service: Consumed 11.730s CPU time.
Nov 22 02:28:23 np0005531887 systemd[1]: run-r00afe907a74e4838ba0b1559fb3fe52a.service: Deactivated successfully.
Nov 22 02:28:24 np0005531887 python3.9[138003]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:24 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:24 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:24 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:25 np0005531887 python3.9[138193]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:25 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:25 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:25 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:26 np0005531887 python3.9[138385]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:26 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:26 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:26 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:27 np0005531887 python3.9[138575]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:28 np0005531887 python3.9[138730]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:28 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:28 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:28 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:29 np0005531887 python3.9[138921]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 02:28:29 np0005531887 systemd[1]: Reloading.
Nov 22 02:28:29 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:28:29 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:28:30 np0005531887 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 22 02:28:30 np0005531887 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 22 02:28:31 np0005531887 python3.9[139114]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:31 np0005531887 podman[139116]: 2025-11-22 07:28:31.204476853 +0000 UTC m=+0.104776316 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 02:28:32 np0005531887 python3.9[139296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:33 np0005531887 python3.9[139451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:34 np0005531887 python3.9[139606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:34 np0005531887 python3.9[139761]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:35 np0005531887 python3.9[139916]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:36 np0005531887 python3.9[140071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:28:37.296 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:28:37.298 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:28:37.299 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:28:37 np0005531887 python3.9[140226]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:38 np0005531887 python3.9[140381]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:39 np0005531887 python3.9[140536]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:40 np0005531887 podman[140538]: 2025-11-22 07:28:40.037465282 +0000 UTC m=+0.065736248 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:28:40 np0005531887 python3.9[140711]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:41 np0005531887 python3.9[140866]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:42 np0005531887 python3.9[141021]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:43 np0005531887 python3.9[141176]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 02:28:45 np0005531887 python3.9[141331]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:46 np0005531887 python3.9[141483]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:47 np0005531887 python3.9[141635]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:48 np0005531887 python3.9[141787]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:48 np0005531887 python3.9[141939]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:49 np0005531887 python3.9[142091]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:28:51 np0005531887 python3.9[142243]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:52 np0005531887 python3.9[142368]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796530.21558-1628-212738624227415/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:53 np0005531887 python3.9[142520]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:53 np0005531887 python3.9[142645]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796532.5368865-1628-170713761960385/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:54 np0005531887 python3.9[142797]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:55 np0005531887 python3.9[142922]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796534.0780976-1628-31534935029762/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:56 np0005531887 python3.9[143074]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:57 np0005531887 python3.9[143199]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796535.7446249-1628-281157621403040/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:57 np0005531887 python3.9[143351]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:28:58 np0005531887 python3.9[143476]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796537.348306-1628-1233871900952/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:28:59 np0005531887 python3.9[143628]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:00 np0005531887 python3.9[143753]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796538.8364732-1628-224696478613899/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:00 np0005531887 python3.9[143905]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:01 np0005531887 podman[144028]: 2025-11-22 07:29:01.423624797 +0000 UTC m=+0.123981902 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 02:29:01 np0005531887 python3.9[144029]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796540.2256982-1628-83385186140623/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:02 np0005531887 python3.9[144206]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:02 np0005531887 python3.9[144331]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796541.7682798-1628-104524417110470/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:03 np0005531887 python3.9[144485]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 22 02:29:04 np0005531887 python3.9[144638]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:05 np0005531887 python3.9[144790]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:06 np0005531887 python3.9[144942]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:07 np0005531887 python3.9[145094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:07 np0005531887 python3.9[145246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:08 np0005531887 python3.9[145398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:09 np0005531887 python3.9[145550]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:09 np0005531887 python3.9[145702]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:10 np0005531887 podman[145826]: 2025-11-22 07:29:10.445113413 +0000 UTC m=+0.054988997 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 02:29:10 np0005531887 python3.9[145870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:11 np0005531887 python3.9[146023]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:12 np0005531887 python3.9[146175]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:12 np0005531887 python3.9[146327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:13 np0005531887 python3.9[146479]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:14 np0005531887 python3.9[146631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:15 np0005531887 python3.9[146783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:15 np0005531887 python3.9[146906]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796554.5752938-2291-188795608999243/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:16 np0005531887 python3.9[147058]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:17 np0005531887 python3.9[147181]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796555.991658-2291-146352916393853/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:18 np0005531887 python3.9[147333]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:18 np0005531887 python3.9[147456]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796557.5477414-2291-112595210095487/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:19 np0005531887 python3.9[147608]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:19 np0005531887 python3.9[147731]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796558.8276901-2291-238630678693654/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:20 np0005531887 python3.9[147883]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:21 np0005531887 python3.9[148006]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796560.0662572-2291-146296009066259/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:22 np0005531887 python3.9[148158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:22 np0005531887 python3.9[148281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796561.4245837-2291-61014991267018/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:23 np0005531887 python3.9[148433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:24 np0005531887 python3.9[148556]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796562.9065125-2291-66262413354500/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:24 np0005531887 python3.9[148708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:25 np0005531887 python3.9[148831]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796564.2558093-2291-193340356104512/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:26 np0005531887 python3.9[148983]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:26 np0005531887 python3.9[149106]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796565.5762768-2291-98455974255935/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:27 np0005531887 python3.9[149258]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:28 np0005531887 python3.9[149381]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796566.992719-2291-10674148287268/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:28 np0005531887 python3.9[149533]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:29 np0005531887 python3.9[149656]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796568.347108-2291-217407383730430/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:30 np0005531887 python3.9[149808]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:30 np0005531887 python3.9[149931]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796569.575804-2291-239513772831231/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:31 np0005531887 python3.9[150083]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:31 np0005531887 podman[150154]: 2025-11-22 07:29:31.864517536 +0000 UTC m=+0.087189118 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 02:29:32 np0005531887 python3.9[150232]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796570.9409552-2291-19490798834305/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:32 np0005531887 python3.9[150384]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:33 np0005531887 python3.9[150507]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796572.2571285-2291-166555332179355/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:34 np0005531887 python3.9[150657]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:29:36 np0005531887 python3.9[150812]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 22 02:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:29:37.297 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:29:37.298 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:29:37.298 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:29:38 np0005531887 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 22 02:29:38 np0005531887 python3.9[150968]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:39 np0005531887 python3.9[151120]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:40 np0005531887 python3.9[151272]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:40 np0005531887 podman[151396]: 2025-11-22 07:29:40.792702934 +0000 UTC m=+0.089999716 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:29:40 np0005531887 python3.9[151439]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:41 np0005531887 python3.9[151595]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:42 np0005531887 python3.9[151747]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:43 np0005531887 python3.9[151899]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:43 np0005531887 python3.9[152051]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:44 np0005531887 python3.9[152205]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:44 np0005531887 python3.9[152357]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:45 np0005531887 python3.9[152509]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:45 np0005531887 systemd[1]: Reloading.
Nov 22 02:29:46 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:46 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:46 np0005531887 systemd[1]: Starting libvirt logging daemon socket...
Nov 22 02:29:46 np0005531887 systemd[1]: Listening on libvirt logging daemon socket.
Nov 22 02:29:46 np0005531887 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 22 02:29:46 np0005531887 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 22 02:29:46 np0005531887 systemd[1]: Starting libvirt logging daemon...
Nov 22 02:29:46 np0005531887 systemd[1]: Started libvirt logging daemon.
Nov 22 02:29:47 np0005531887 python3.9[152702]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:47 np0005531887 systemd[1]: Reloading.
Nov 22 02:29:47 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:47 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:47 np0005531887 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 22 02:29:47 np0005531887 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 22 02:29:47 np0005531887 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 22 02:29:47 np0005531887 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 22 02:29:47 np0005531887 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 22 02:29:47 np0005531887 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 22 02:29:47 np0005531887 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 02:29:47 np0005531887 systemd[1]: Started libvirt nodedev daemon.
Nov 22 02:29:48 np0005531887 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 22 02:29:48 np0005531887 python3.9[152918]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:48 np0005531887 systemd[1]: Reloading.
Nov 22 02:29:48 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:48 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:48 np0005531887 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 22 02:29:48 np0005531887 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 22 02:29:48 np0005531887 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 22 02:29:48 np0005531887 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 22 02:29:48 np0005531887 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 22 02:29:48 np0005531887 systemd[1]: Starting libvirt proxy daemon...
Nov 22 02:29:48 np0005531887 systemd[1]: Started libvirt proxy daemon.
Nov 22 02:29:48 np0005531887 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 22 02:29:48 np0005531887 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 22 02:29:49 np0005531887 python3.9[153139]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:49 np0005531887 systemd[1]: Reloading.
Nov 22 02:29:49 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:49 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:49 np0005531887 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 16f06045-00e7-4d0e-b787-c33733fa914d
Nov 22 02:29:49 np0005531887 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 22 02:29:49 np0005531887 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 16f06045-00e7-4d0e-b787-c33733fa914d
Nov 22 02:29:49 np0005531887 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 22 02:29:49 np0005531887 systemd[1]: Listening on libvirt locking daemon socket.
Nov 22 02:29:49 np0005531887 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 22 02:29:49 np0005531887 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 22 02:29:49 np0005531887 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 22 02:29:49 np0005531887 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 22 02:29:49 np0005531887 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 22 02:29:49 np0005531887 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 22 02:29:49 np0005531887 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 22 02:29:49 np0005531887 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 22 02:29:49 np0005531887 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 22 02:29:49 np0005531887 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 02:29:49 np0005531887 systemd[1]: Started libvirt QEMU daemon.
Nov 22 02:29:50 np0005531887 python3.9[153355]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:29:50 np0005531887 systemd[1]: Reloading.
Nov 22 02:29:51 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:29:51 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:29:51 np0005531887 systemd[1]: Starting libvirt secret daemon socket...
Nov 22 02:29:51 np0005531887 systemd[1]: Listening on libvirt secret daemon socket.
Nov 22 02:29:51 np0005531887 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 22 02:29:51 np0005531887 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 22 02:29:51 np0005531887 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 22 02:29:51 np0005531887 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 22 02:29:51 np0005531887 systemd[1]: Starting libvirt secret daemon...
Nov 22 02:29:51 np0005531887 systemd[1]: Started libvirt secret daemon.
Nov 22 02:29:53 np0005531887 python3.9[153567]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:54 np0005531887 python3.9[153719]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:29:55 np0005531887 python3.9[153871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:56 np0005531887 python3.9[153994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796594.7204664-3326-274273082525034/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:57 np0005531887 python3.9[154146]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:58 np0005531887 python3.9[154298]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:58 np0005531887 python3.9[154376]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:29:59 np0005531887 python3.9[154528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:29:59 np0005531887 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 22 02:29:59 np0005531887 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 22 02:30:00 np0005531887 python3.9[154606]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.cp32auwi recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:00 np0005531887 python3.9[154758]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:01 np0005531887 python3.9[154836]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:02 np0005531887 podman[154988]: 2025-11-22 07:30:02.026449819 +0000 UTC m=+0.097073418 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:30:02 np0005531887 python3.9[154989]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:03 np0005531887 python3[155167]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:30:03 np0005531887 python3.9[155319]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:04 np0005531887 python3.9[155397]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:06 np0005531887 python3.9[155549]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:06 np0005531887 python3.9[155627]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:07 np0005531887 python3.9[155779]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:08 np0005531887 python3.9[155857]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:08 np0005531887 python3.9[156009]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:09 np0005531887 python3.9[156087]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:10 np0005531887 python3.9[156239]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:10 np0005531887 python3.9[156364]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796609.5333397-3701-248417191611866/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:11 np0005531887 podman[156488]: 2025-11-22 07:30:11.285762816 +0000 UTC m=+0.055542730 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:30:11 np0005531887 python3.9[156533]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:12 np0005531887 python3.9[156685]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:13 np0005531887 python3.9[156840]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:14 np0005531887 python3.9[156992]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:15 np0005531887 python3.9[157145]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:15 np0005531887 python3.9[157299]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:16 np0005531887 python3.9[157454]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:17 np0005531887 python3.9[157606]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:18 np0005531887 python3.9[157729]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796616.8075988-3917-185264133656680/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:18 np0005531887 python3.9[157881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:19 np0005531887 python3.9[158004]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796618.3248465-3962-37020233702253/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:20 np0005531887 python3.9[158156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:20 np0005531887 python3.9[158279]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796619.6597292-4007-32829474198838/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:21 np0005531887 python3.9[158431]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:21 np0005531887 systemd[1]: Reloading.
Nov 22 02:30:21 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:21 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:21 np0005531887 systemd[1]: Reached target edpm_libvirt.target.
Nov 22 02:30:22 np0005531887 python3.9[158622]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 02:30:22 np0005531887 systemd[1]: Reloading.
Nov 22 02:30:22 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:22 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:23 np0005531887 systemd[1]: Reloading.
Nov 22 02:30:23 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:24 np0005531887 systemd[1]: session-23.scope: Deactivated successfully.
Nov 22 02:30:24 np0005531887 systemd[1]: session-23.scope: Consumed 3min 31.331s CPU time.
Nov 22 02:30:24 np0005531887 systemd-logind[821]: Session 23 logged out. Waiting for processes to exit.
Nov 22 02:30:24 np0005531887 systemd-logind[821]: Removed session 23.
Nov 22 02:30:29 np0005531887 systemd-logind[821]: New session 24 of user zuul.
Nov 22 02:30:29 np0005531887 systemd[1]: Started Session 24 of User zuul.
Nov 22 02:30:30 np0005531887 python3.9[158873]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:30:31 np0005531887 python3.9[159027]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:30:31 np0005531887 network[159044]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:30:31 np0005531887 network[159045]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:30:31 np0005531887 network[159046]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:30:32 np0005531887 podman[159053]: 2025-11-22 07:30:32.90841286 +0000 UTC m=+0.108013729 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:30:36 np0005531887 python3.9[159343]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 02:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:30:37.299 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:30:37.300 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:30:37.300 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:30:37 np0005531887 python3.9[159427]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:30:41 np0005531887 podman[159429]: 2025-11-22 07:30:41.855442863 +0000 UTC m=+0.071030537 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:30:44 np0005531887 python3.9[159600]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:45 np0005531887 python3.9[159752]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:45 np0005531887 python3.9[159905]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:30:46 np0005531887 python3.9[160057]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:30:47 np0005531887 python3.9[160210]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:30:48 np0005531887 python3.9[160333]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796646.9985063-251-59213784365287/.source.iscsi _original_basename=.4hp7d3bc follow=False checksum=4b3f635a04202901c6e6f760e001f51b4b8ca147 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:49 np0005531887 python3.9[160485]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:49 np0005531887 python3.9[160637]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:30:50 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:30:50 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:30:51 np0005531887 python3.9[160790]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:51 np0005531887 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 22 02:30:52 np0005531887 python3.9[160946]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:30:52 np0005531887 systemd[1]: Reloading.
Nov 22 02:30:52 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:30:52 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:30:52 np0005531887 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 02:30:52 np0005531887 systemd[1]: Starting Open-iSCSI...
Nov 22 02:30:52 np0005531887 kernel: Loading iSCSI transport class v2.0-870.
Nov 22 02:30:52 np0005531887 systemd[1]: Started Open-iSCSI.
Nov 22 02:30:52 np0005531887 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 22 02:30:52 np0005531887 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 22 02:30:53 np0005531887 python3.9[161147]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:30:53 np0005531887 network[161164]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:30:53 np0005531887 network[161165]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:30:53 np0005531887 network[161166]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:30:58 np0005531887 python3.9[161437]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:30:59 np0005531887 python3.9[161589]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 22 02:30:59 np0005531887 python3.9[161745]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:00 np0005531887 python3.9[161868]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796659.4841437-482-17080628446241/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:01 np0005531887 python3.9[162020]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:02 np0005531887 python3.9[162172]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:02 np0005531887 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 02:31:02 np0005531887 systemd[1]: Stopped Load Kernel Modules.
Nov 22 02:31:02 np0005531887 systemd[1]: Stopping Load Kernel Modules...
Nov 22 02:31:02 np0005531887 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:31:02 np0005531887 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:31:03 np0005531887 podman[162329]: 2025-11-22 07:31:03.087378956 +0000 UTC m=+0.099550068 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:31:03 np0005531887 python3.9[162330]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:03 np0005531887 python3.9[162509]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:04 np0005531887 python3.9[162661]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:05 np0005531887 python3.9[162813]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:05 np0005531887 python3.9[162936]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796664.9698746-656-9951819791189/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:06 np0005531887 python3.9[163088]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:31:07 np0005531887 python3.9[163241]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:08 np0005531887 python3.9[163395]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:08 np0005531887 python3.9[163547]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:09 np0005531887 python3.9[163699]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:10 np0005531887 python3.9[163851]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:10 np0005531887 python3.9[164003]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:11 np0005531887 python3.9[164155]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:12 np0005531887 podman[164279]: 2025-11-22 07:31:12.026947661 +0000 UTC m=+0.063986859 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:31:12 np0005531887 python3.9[164325]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:13 np0005531887 python3.9[164480]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:13 np0005531887 python3.9[164632]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:14 np0005531887 python3.9[164784]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:14 np0005531887 python3.9[164862]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:15 np0005531887 python3.9[165014]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:16 np0005531887 python3.9[165092]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:16 np0005531887 python3.9[165244]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:17 np0005531887 python3.9[165396]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:18 np0005531887 python3.9[165474]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:18 np0005531887 python3.9[165626]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:19 np0005531887 python3.9[165704]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:20 np0005531887 python3.9[165856]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:20 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:21 np0005531887 python3.9[166044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:21 np0005531887 python3.9[166122]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:22 np0005531887 python3.9[166275]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:22 np0005531887 python3.9[166353]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:23 np0005531887 python3.9[166505]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:23 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:23 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:23 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:23 np0005531887 systemd[1]: Starting Create netns directory...
Nov 22 02:31:23 np0005531887 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 02:31:23 np0005531887 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 02:31:23 np0005531887 systemd[1]: Finished Create netns directory.
Nov 22 02:31:25 np0005531887 python3.9[166698]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:25 np0005531887 python3.9[166850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:26 np0005531887 python3.9[166973]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796685.2894828-1277-190159069987444/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:27 np0005531887 python3.9[167125]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:31:28 np0005531887 python3.9[167277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:28 np0005531887 python3.9[167400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796687.5927336-1352-618704201868/.source.json _original_basename=.u7u65jgz follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:29 np0005531887 python3.9[167552]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:31 np0005531887 python3.9[167979]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 22 02:31:33 np0005531887 podman[168131]: 2025-11-22 07:31:33.249018121 +0000 UTC m=+0.099345634 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 22 02:31:33 np0005531887 python3.9[168132]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:31:34 np0005531887 python3.9[168308]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 02:31:36 np0005531887 python3[168487]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:31:36 np0005531887 podman[168523]: 2025-11-22 07:31:36.40397781 +0000 UTC m=+0.055780482 container create 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 22 02:31:36 np0005531887 podman[168523]: 2025-11-22 07:31:36.37047587 +0000 UTC m=+0.022278572 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:31:36 np0005531887 python3[168487]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 02:31:37 np0005531887 python3.9[168711]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:31:37.300 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:31:37.301 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:31:37.302 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:31:38 np0005531887 python3.9[168865]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:38 np0005531887 python3.9[168941]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:39 np0005531887 python3.9[169092]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796698.7097814-1616-82626548821061/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:39 np0005531887 python3.9[169168]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:31:39 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:40 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:40 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:40 np0005531887 python3.9[169279]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:31:40 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:41 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:41 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:41 np0005531887 systemd[1]: Starting multipathd container...
Nov 22 02:31:41 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:31:41 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6774b32afd9ee80575d221096fcac2422550aefd0b5b946fb5158e3bceefef5f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:41 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6774b32afd9ee80575d221096fcac2422550aefd0b5b946fb5158e3bceefef5f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:41 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.
Nov 22 02:31:41 np0005531887 podman[169319]: 2025-11-22 07:31:41.391541737 +0000 UTC m=+0.129011775 container init 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 02:31:41 np0005531887 multipathd[169335]: + sudo -E kolla_set_configs
Nov 22 02:31:41 np0005531887 podman[169319]: 2025-11-22 07:31:41.4192979 +0000 UTC m=+0.156767908 container start 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:31:41 np0005531887 multipathd[169335]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:31:41 np0005531887 multipathd[169335]: INFO:__main__:Validating config file
Nov 22 02:31:41 np0005531887 multipathd[169335]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:31:41 np0005531887 multipathd[169335]: INFO:__main__:Writing out command to execute
Nov 22 02:31:41 np0005531887 multipathd[169335]: ++ cat /run_command
Nov 22 02:31:41 np0005531887 multipathd[169335]: + CMD='/usr/sbin/multipathd -d'
Nov 22 02:31:41 np0005531887 multipathd[169335]: + ARGS=
Nov 22 02:31:41 np0005531887 multipathd[169335]: + sudo kolla_copy_cacerts
Nov 22 02:31:41 np0005531887 multipathd[169335]: + [[ ! -n '' ]]
Nov 22 02:31:41 np0005531887 multipathd[169335]: + . kolla_extend_start
Nov 22 02:31:41 np0005531887 multipathd[169335]: Running command: '/usr/sbin/multipathd -d'
Nov 22 02:31:41 np0005531887 multipathd[169335]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 02:31:41 np0005531887 multipathd[169335]: + umask 0022
Nov 22 02:31:41 np0005531887 multipathd[169335]: + exec /usr/sbin/multipathd -d
Nov 22 02:31:41 np0005531887 multipathd[169335]: 3444.184436 | --------start up--------
Nov 22 02:31:41 np0005531887 multipathd[169335]: 3444.184457 | read /etc/multipath.conf
Nov 22 02:31:41 np0005531887 multipathd[169335]: 3444.190596 | path checkers start up
Nov 22 02:31:41 np0005531887 podman[169319]: multipathd
Nov 22 02:31:41 np0005531887 systemd[1]: Started multipathd container.
Nov 22 02:31:41 np0005531887 podman[169342]: 2025-11-22 07:31:41.668240655 +0000 UTC m=+0.238336838 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:31:42 np0005531887 podman[169498]: 2025-11-22 07:31:42.202358186 +0000 UTC m=+0.051240266 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:31:42 np0005531887 python3.9[169539]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:31:43 np0005531887 python3.9[169698]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:31:44 np0005531887 python3.9[169863]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:44 np0005531887 systemd[1]: Stopping multipathd container...
Nov 22 02:31:44 np0005531887 multipathd[169335]: 3447.169220 | exit (signal)
Nov 22 02:31:44 np0005531887 multipathd[169335]: 3447.169809 | --------shut down-------
Nov 22 02:31:44 np0005531887 systemd[1]: libpod-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope: Deactivated successfully.
Nov 22 02:31:44 np0005531887 conmon[169335]: conmon 30a952e857b1c4f66fb6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope/container/memory.events
Nov 22 02:31:44 np0005531887 podman[169867]: 2025-11-22 07:31:44.53171347 +0000 UTC m=+0.328075808 container died 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:31:44 np0005531887 systemd[1]: 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3-57dfb858fba056d1.timer: Deactivated successfully.
Nov 22 02:31:44 np0005531887 systemd[1]: Stopped /usr/bin/podman healthcheck run 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.
Nov 22 02:31:44 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3-userdata-shm.mount: Deactivated successfully.
Nov 22 02:31:44 np0005531887 systemd[1]: var-lib-containers-storage-overlay-6774b32afd9ee80575d221096fcac2422550aefd0b5b946fb5158e3bceefef5f-merged.mount: Deactivated successfully.
Nov 22 02:31:44 np0005531887 podman[169867]: 2025-11-22 07:31:44.860471193 +0000 UTC m=+0.656833531 container cleanup 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 22 02:31:44 np0005531887 podman[169867]: multipathd
Nov 22 02:31:44 np0005531887 podman[169895]: multipathd
Nov 22 02:31:44 np0005531887 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 22 02:31:44 np0005531887 systemd[1]: Stopped multipathd container.
Nov 22 02:31:44 np0005531887 systemd[1]: Starting multipathd container...
Nov 22 02:31:45 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:31:45 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6774b32afd9ee80575d221096fcac2422550aefd0b5b946fb5158e3bceefef5f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:45 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6774b32afd9ee80575d221096fcac2422550aefd0b5b946fb5158e3bceefef5f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:31:45 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.
Nov 22 02:31:45 np0005531887 podman[169908]: 2025-11-22 07:31:45.063942344 +0000 UTC m=+0.114744296 container init 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:31:45 np0005531887 multipathd[169922]: + sudo -E kolla_set_configs
Nov 22 02:31:45 np0005531887 podman[169908]: 2025-11-22 07:31:45.091821361 +0000 UTC m=+0.142623283 container start 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:31:45 np0005531887 podman[169908]: multipathd
Nov 22 02:31:45 np0005531887 systemd[1]: Started multipathd container.
Nov 22 02:31:45 np0005531887 multipathd[169922]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:31:45 np0005531887 multipathd[169922]: INFO:__main__:Validating config file
Nov 22 02:31:45 np0005531887 multipathd[169922]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:31:45 np0005531887 multipathd[169922]: INFO:__main__:Writing out command to execute
Nov 22 02:31:45 np0005531887 multipathd[169922]: ++ cat /run_command
Nov 22 02:31:45 np0005531887 multipathd[169922]: + CMD='/usr/sbin/multipathd -d'
Nov 22 02:31:45 np0005531887 multipathd[169922]: + ARGS=
Nov 22 02:31:45 np0005531887 multipathd[169922]: + sudo kolla_copy_cacerts
Nov 22 02:31:45 np0005531887 podman[169929]: 2025-11-22 07:31:45.164783263 +0000 UTC m=+0.060244114 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 22 02:31:45 np0005531887 multipathd[169922]: Running command: '/usr/sbin/multipathd -d'
Nov 22 02:31:45 np0005531887 multipathd[169922]: + [[ ! -n '' ]]
Nov 22 02:31:45 np0005531887 multipathd[169922]: + . kolla_extend_start
Nov 22 02:31:45 np0005531887 multipathd[169922]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 02:31:45 np0005531887 multipathd[169922]: + umask 0022
Nov 22 02:31:45 np0005531887 multipathd[169922]: + exec /usr/sbin/multipathd -d
Nov 22 02:31:45 np0005531887 systemd[1]: 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3-7d582665711b267e.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:31:45 np0005531887 systemd[1]: 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3-7d582665711b267e.service: Failed with result 'exit-code'.
Nov 22 02:31:45 np0005531887 multipathd[169922]: 3447.849946 | --------start up--------
Nov 22 02:31:45 np0005531887 multipathd[169922]: 3447.849969 | read /etc/multipath.conf
Nov 22 02:31:45 np0005531887 multipathd[169922]: 3447.855343 | path checkers start up
Nov 22 02:31:45 np0005531887 python3.9[170114]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:46 np0005531887 python3.9[170266]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 02:31:47 np0005531887 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 22 02:31:47 np0005531887 python3.9[170418]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 22 02:31:47 np0005531887 kernel: Key type psk registered
Nov 22 02:31:48 np0005531887 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 02:31:48 np0005531887 python3.9[170582]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:31:49 np0005531887 python3.9[170706]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796708.1532192-1856-246632365501387/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:49 np0005531887 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 22 02:31:50 np0005531887 python3.9[170859]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:31:51 np0005531887 python3.9[171011]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:31:51 np0005531887 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 22 02:31:51 np0005531887 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 02:31:51 np0005531887 systemd[1]: Stopped Load Kernel Modules.
Nov 22 02:31:51 np0005531887 systemd[1]: Stopping Load Kernel Modules...
Nov 22 02:31:51 np0005531887 systemd[1]: Starting Load Kernel Modules...
Nov 22 02:31:51 np0005531887 systemd[1]: Finished Load Kernel Modules.
Nov 22 02:31:52 np0005531887 python3.9[171168]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 02:31:55 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:55 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:55 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:55 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:55 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:55 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:56 np0005531887 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 02:31:56 np0005531887 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 02:31:56 np0005531887 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 02:31:56 np0005531887 systemd[1]: Starting man-db-cache-update.service...
Nov 22 02:31:56 np0005531887 systemd[1]: Reloading.
Nov 22 02:31:56 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:31:56 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:31:57 np0005531887 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 02:32:00 np0005531887 python3.9[172621]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:32:00 np0005531887 iscsid[160986]: iscsid shutting down.
Nov 22 02:32:00 np0005531887 systemd[1]: Stopping Open-iSCSI...
Nov 22 02:32:00 np0005531887 systemd[1]: iscsid.service: Deactivated successfully.
Nov 22 02:32:00 np0005531887 systemd[1]: Stopped Open-iSCSI.
Nov 22 02:32:00 np0005531887 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 02:32:00 np0005531887 systemd[1]: Starting Open-iSCSI...
Nov 22 02:32:00 np0005531887 systemd[1]: Started Open-iSCSI.
Nov 22 02:32:00 np0005531887 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 02:32:00 np0005531887 systemd[1]: Finished man-db-cache-update.service.
Nov 22 02:32:00 np0005531887 systemd[1]: man-db-cache-update.service: Consumed 1.816s CPU time.
Nov 22 02:32:00 np0005531887 systemd[1]: run-rc40fb77f5cd349f1b22d7e8df43f24e9.service: Deactivated successfully.
Nov 22 02:32:01 np0005531887 python3.9[172776]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:32:02 np0005531887 python3.9[172932]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:03 np0005531887 podman[173056]: 2025-11-22 07:32:03.467026519 +0000 UTC m=+0.092605619 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:32:03 np0005531887 python3.9[173100]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:32:03 np0005531887 systemd[1]: Reloading.
Nov 22 02:32:03 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:32:03 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:32:04 np0005531887 python3.9[173293]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:32:04 np0005531887 network[173310]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:32:04 np0005531887 network[173311]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:32:04 np0005531887 network[173312]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:32:09 np0005531887 python3.9[173586]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:09 np0005531887 python3.9[173739]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:10 np0005531887 python3.9[173892]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:11 np0005531887 python3.9[174045]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:12 np0005531887 python3.9[174198]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:12 np0005531887 podman[174200]: 2025-11-22 07:32:12.484229967 +0000 UTC m=+0.063341649 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:32:13 np0005531887 python3.9[174371]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:13 np0005531887 python3.9[174524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:14 np0005531887 python3.9[174677]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:32:15 np0005531887 podman[174802]: 2025-11-22 07:32:15.720663848 +0000 UTC m=+0.064722462 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 22 02:32:15 np0005531887 python3.9[174848]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:16 np0005531887 python3.9[175001]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:17 np0005531887 python3.9[175153]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:17 np0005531887 python3.9[175305]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:18 np0005531887 python3.9[175457]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:19 np0005531887 python3.9[175609]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:19 np0005531887 python3.9[175761]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:20 np0005531887 python3.9[175913]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:21 np0005531887 python3.9[176065]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:21 np0005531887 python3.9[176217]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:22 np0005531887 python3.9[176369]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:23 np0005531887 python3.9[176521]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:24 np0005531887 python3.9[176673]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:24 np0005531887 python3.9[176825]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:25 np0005531887 python3.9[176977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:25 np0005531887 python3.9[177129]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:32:26 np0005531887 python3.9[177281]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:27 np0005531887 python3.9[177433]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:32:28 np0005531887 python3.9[177585]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:32:28 np0005531887 systemd[1]: Reloading.
Nov 22 02:32:28 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:32:28 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:32:29 np0005531887 python3.9[177772]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:30 np0005531887 python3.9[177925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:31 np0005531887 python3.9[178078]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:31 np0005531887 python3.9[178231]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:32 np0005531887 python3.9[178384]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:33 np0005531887 python3.9[178537]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:33 np0005531887 podman[178662]: 2025-11-22 07:32:33.612450569 +0000 UTC m=+0.085115867 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 22 02:32:33 np0005531887 python3.9[178707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:34 np0005531887 python3.9[178867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:32:36 np0005531887 python3.9[179022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:37 np0005531887 python3.9[179174]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:32:37.302 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:32:37.303 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:32:37.304 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:32:37 np0005531887 python3.9[179326]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:38 np0005531887 python3.9[179478]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:39 np0005531887 python3.9[179630]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:39 np0005531887 python3.9[179782]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:40 np0005531887 python3.9[179934]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:41 np0005531887 python3.9[180086]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531887 python3.9[180238]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531887 python3.9[180390]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:42 np0005531887 podman[180391]: 2025-11-22 07:32:42.700123607 +0000 UTC m=+0.059497083 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 02:32:45 np0005531887 podman[180433]: 2025-11-22 07:32:45.845274997 +0000 UTC m=+0.064351221 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:32:48 np0005531887 python3.9[180580]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 22 02:32:49 np0005531887 python3.9[180733]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:32:50 np0005531887 python3.9[180891]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:32:52 np0005531887 systemd-logind[821]: New session 25 of user zuul.
Nov 22 02:32:52 np0005531887 systemd[1]: Started Session 25 of User zuul.
Nov 22 02:32:52 np0005531887 systemd[1]: session-25.scope: Deactivated successfully.
Nov 22 02:32:52 np0005531887 systemd-logind[821]: Session 25 logged out. Waiting for processes to exit.
Nov 22 02:32:52 np0005531887 systemd-logind[821]: Removed session 25.
Nov 22 02:32:53 np0005531887 python3.9[181077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:53 np0005531887 python3.9[181198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796772.6872218-3419-37446860865489/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:54 np0005531887 python3.9[181348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:54 np0005531887 python3.9[181424]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:55 np0005531887 python3.9[181574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:55 np0005531887 python3.9[181695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796774.9267118-3419-11666261854405/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:56 np0005531887 python3.9[181845]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:57 np0005531887 python3.9[181966]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796776.0789666-3419-200029179392749/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:57 np0005531887 python3.9[182116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:58 np0005531887 python3.9[182237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796777.227517-3419-94099662684504/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:32:58 np0005531887 python3.9[182387]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:32:59 np0005531887 python3.9[182508]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796778.4608903-3419-134623824852009/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:00 np0005531887 python3.9[182660]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:00 np0005531887 python3.9[182812]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:01 np0005531887 python3.9[182964]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:02 np0005531887 python3.9[183116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:03 np0005531887 python3.9[183239]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763796781.942799-3740-6328901949447/.source _original_basename=.q6cq6h12 follow=False checksum=446163b1f0bf71986f29ce9581f9d167b4b84004 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 22 02:33:03 np0005531887 podman[183365]: 2025-11-22 07:33:03.750012494 +0000 UTC m=+0.094796039 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:33:03 np0005531887 python3.9[183403]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:04 np0005531887 python3.9[183568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:05 np0005531887 python3.9[183689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796784.1049037-3819-2247516237199/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:05 np0005531887 python3.9[183839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:06 np0005531887 python3.9[183960]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796785.4640205-3863-270964885728011/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:07 np0005531887 python3.9[184112]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 22 02:33:08 np0005531887 python3.9[184264]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:33:09 np0005531887 python3[184416]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:33:09 np0005531887 podman[184449]: 2025-11-22 07:33:09.639008005 +0000 UTC m=+0.076547007 container create b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 02:33:09 np0005531887 podman[184449]: 2025-11-22 07:33:09.583051658 +0000 UTC m=+0.020590690 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:33:09 np0005531887 python3[184416]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 22 02:33:10 np0005531887 python3.9[184639]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:11 np0005531887 python3.9[184793]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 22 02:33:12 np0005531887 python3.9[184946]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:33:12 np0005531887 podman[184971]: 2025-11-22 07:33:12.850068753 +0000 UTC m=+0.065867519 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:33:13 np0005531887 python3[185117]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:33:13 np0005531887 podman[185152]: 2025-11-22 07:33:13.899505835 +0000 UTC m=+0.058331626 container create 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, managed_by=edpm_ansible, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:33:13 np0005531887 podman[185152]: 2025-11-22 07:33:13.866963756 +0000 UTC m=+0.025789567 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 02:33:13 np0005531887 python3[185117]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 22 02:33:14 np0005531887 python3.9[185342]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:15 np0005531887 python3.9[185496]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:16 np0005531887 podman[185619]: 2025-11-22 07:33:16.080040609 +0000 UTC m=+0.066084733 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:33:16 np0005531887 python3.9[185667]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796795.6685565-4139-227270412993740/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:16 np0005531887 python3.9[185743]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:16 np0005531887 systemd[1]: Reloading.
Nov 22 02:33:16 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:16 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:17 np0005531887 python3.9[185854]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:33:17 np0005531887 systemd[1]: Reloading.
Nov 22 02:33:17 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:17 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:18 np0005531887 systemd[1]: Starting nova_compute container...
Nov 22 02:33:18 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:33:18 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:18 np0005531887 podman[185894]: 2025-11-22 07:33:18.201991103 +0000 UTC m=+0.113005281 container init 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Nov 22 02:33:18 np0005531887 podman[185894]: 2025-11-22 07:33:18.207812414 +0000 UTC m=+0.118826582 container start 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:33:18 np0005531887 podman[185894]: nova_compute
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + sudo -E kolla_set_configs
Nov 22 02:33:18 np0005531887 systemd[1]: Started nova_compute container.
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Validating config file
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying service configuration files
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Deleting /etc/ceph
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Creating directory /etc/ceph
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Writing out command to execute
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:18 np0005531887 nova_compute[185912]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:18 np0005531887 nova_compute[185912]: ++ cat /run_command
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + CMD=nova-compute
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + ARGS=
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + sudo kolla_copy_cacerts
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + [[ ! -n '' ]]
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + . kolla_extend_start
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 02:33:18 np0005531887 nova_compute[185912]: Running command: 'nova-compute'
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + umask 0022
Nov 22 02:33:18 np0005531887 nova_compute[185912]: + exec nova-compute
Nov 22 02:33:19 np0005531887 python3.9[186074]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:20 np0005531887 python3.9[186224]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.536 185916 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.537 185916 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.537 185916 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.537 185916 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.765 185916 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.785 185916 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:33:20 np0005531887 nova_compute[185912]: 2025-11-22 07:33:20.786 185916 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.298 185916 INFO nova.virt.driver [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 22 02:33:21 np0005531887 python3.9[186378]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.425 185916 INFO nova.compute.provider_config [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.442 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.443 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.443 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.443 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.444 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.444 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.444 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.444 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.444 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.445 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.446 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.447 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.447 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.447 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.447 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.447 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.448 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.448 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.448 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.448 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.448 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.449 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.449 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.449 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.449 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.449 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.450 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.450 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.450 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.450 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.450 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.451 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.452 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.453 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.453 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.453 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.453 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.453 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.454 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.455 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.456 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.457 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.458 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.459 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.460 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.461 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.461 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.461 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.461 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.461 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.462 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.463 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.464 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.465 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.466 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.467 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.467 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.467 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.467 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.467 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.468 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.469 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.470 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.471 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.472 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.473 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.474 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.475 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.475 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.475 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.475 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.475 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.476 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.477 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.477 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.477 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.477 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.477 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.478 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.479 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.480 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.481 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.482 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.483 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.483 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.483 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.483 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.483 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.484 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.485 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.486 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.487 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.488 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.489 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.490 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.491 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.492 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.493 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.494 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.495 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.496 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.497 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.498 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.499 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.499 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.499 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.499 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.499 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.500 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.501 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.502 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.503 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.504 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.504 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.504 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.504 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.504 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.505 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.506 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.507 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.508 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.509 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.510 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.511 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.512 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.513 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.513 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.513 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.513 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.513 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.514 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.515 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.516 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.517 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.518 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.518 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.518 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.518 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.518 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.519 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.520 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.520 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.520 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.520 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.520 185916 WARNING oslo_config.cfg [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 02:33:21 np0005531887 nova_compute[185912]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 02:33:21 np0005531887 nova_compute[185912]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 02:33:21 np0005531887 nova_compute[185912]: and ``live_migration_inbound_addr`` respectively.
Nov 22 02:33:21 np0005531887 nova_compute[185912]: ).  Its value may be silently ignored in the future.#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.521 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
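[editor's note: the deprecation warning above says ``live_migration_uri`` should be replaced by ``live_migration_scheme`` and ``live_migration_inbound_addr``. A minimal sketch of the equivalent ``nova.conf`` settings for the ``qemu+tls://%s/system`` URI logged here, assuming the migration target address is reachable as shown; the address value is a hypothetical placeholder:]

```ini
[libvirt]
# Instead of the deprecated:
#   live_migration_uri = qemu+tls://%s/system
# the "tls" scheme reproduces the qemu+tls transport:
live_migration_scheme = tls
# Address (or hostname) the target compute host listens on for
# incoming migration traffic -- placeholder, site-specific:
live_migration_inbound_addr = 192.0.2.10
```

[editor's note: with ``live_migration_with_native_tls = True`` as logged above, the TLS scheme keeps the migration stream encrypted without tunnelling it through libvirtd.]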
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.521 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.521 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.521 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.521 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.522 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.522 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.522 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.522 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.522 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.523 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.523 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.523 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.523 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.523 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.524 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.524 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.524 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.524 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.524 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.525 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.526 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.527 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.528 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.529 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.530 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.531 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.532 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.533 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.533 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.534 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.534 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.534 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.534 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.535 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.535 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.537 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.537 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.538 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.538 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.538 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.538 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.539 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.539 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.539 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.539 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.539 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.540 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.541 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.541 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.541 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.541 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.541 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.542 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.542 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.542 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.542 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.542 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.543 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.544 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.544 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.544 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.544 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.544 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.545 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.546 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.547 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.548 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.548 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.548 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.548 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.548 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.549 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.550 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.551 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.552 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.553 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.554 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.555 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.556 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.557 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.558 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.559 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.560 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.561 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.562 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.563 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.564 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.565 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.565 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.565 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.565 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.565 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.566 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.567 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.568 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.569 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.570 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.571 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.572 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.572 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.572 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.572 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.572 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.573 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.574 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.575 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.576 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.577 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.578 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.578 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.578 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.578 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.578 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.579 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.579 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.579 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.579 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.579 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.580 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.580 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.580 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.580 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.580 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.581 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.581 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.581 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.581 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.581 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.582 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.583 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.584 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.585 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.585 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.585 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.585 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.586 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.587 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.588 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.589 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.590 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.591 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.592 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.593 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.594 185916 DEBUG oslo_service.service [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.596 185916 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.627 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.628 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.628 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.629 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 22 02:33:21 np0005531887 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 02:33:21 np0005531887 systemd[1]: Started libvirt QEMU daemon.
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.716 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f644fc9bf70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.719 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f644fc9bf70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.720 185916 INFO nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.734 185916 WARNING nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 22 02:33:21 np0005531887 nova_compute[185912]: 2025-11-22 07:33:21.735 185916 DEBUG nova.virt.libvirt.volume.mount [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 22 02:33:22 np0005531887 python3.9[186584]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.521 185916 INFO nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 
Nov 22 02:33:22 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <host>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <uuid>e5ccb90d-580e-48d9-a7d0-f6edef583e11</uuid>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <arch>x86_64</arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <microcode version='16777317'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <signature family='23' model='49' stepping='0'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='x2apic'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='tsc-deadline'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='osxsave'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='hypervisor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='tsc_adjust'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='spec-ctrl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='stibp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='arch-capabilities'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='cmp_legacy'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='topoext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='virt-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='lbrv'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='tsc-scale'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='vmcb-clean'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='pause-filter'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='pfthreshold'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='rdctl-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='skip-l1dfl-vmentry'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='mds-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature name='pschange-mc-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <pages unit='KiB' size='4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <pages unit='KiB' size='2048'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <pages unit='KiB' size='1048576'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <power_management>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <suspend_mem/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <suspend_disk/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <suspend_hybrid/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </power_management>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <iommu support='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <migration_features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <live/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <uri_transports>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <uri_transport>tcp</uri_transport>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <uri_transport>rdma</uri_transport>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </uri_transports>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </migration_features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <topology>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <cells num='1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <cell id='0'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <memory unit='KiB'>7864316</memory>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <pages unit='KiB' size='2048'>0</pages>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <distances>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <sibling id='0' value='10'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          </distances>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          <cpus num='8'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:          </cpus>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        </cell>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </cells>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </topology>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <cache>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </cache>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <secmodel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model>selinux</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <doi>0</doi>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </secmodel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <secmodel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model>dac</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <doi>0</doi>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </secmodel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </host>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <guest>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <os_type>hvm</os_type>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <arch name='i686'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <wordsize>32</wordsize>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <domain type='qemu'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <domain type='kvm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <pae/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <nonpae/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <apic default='on' toggle='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <cpuselection/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <deviceboot/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <externalSnapshot/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </guest>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <guest>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <os_type>hvm</os_type>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <arch name='x86_64'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <wordsize>64</wordsize>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <domain type='qemu'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <domain type='kvm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <apic default='on' toggle='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <cpuselection/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <deviceboot/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <externalSnapshot/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </guest>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </capabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.528 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.552 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 02:33:22 np0005531887 nova_compute[185912]: <domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <arch>i686</arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <vcpu max='4096'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <os supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>rom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pflash</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>yes</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='secure'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </loader>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </os>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>memfd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </memoryBacking>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>disk</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>floppy</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>lun</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>fdc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>sata</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </disk>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vnc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </graphics>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <video supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vga</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>none</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>bochs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </video>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='mode'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>requisite</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>optional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pci</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hostdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>random</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </rng>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>path</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>handle</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </filesystem>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emulator</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>external</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>2.0</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </tpm>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </redirdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </channel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </crypto>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>passt</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </interface>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>isa</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </panic>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <console supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>null</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dev</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pipe</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stdio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>udp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tcp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </console>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='features'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vapic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>runtime</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>synic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stimer</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reset</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ipi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>avic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hyperv>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tdx</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </launchSecurity>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.560 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 02:33:22 np0005531887 nova_compute[185912]: <domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <arch>i686</arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <vcpu max='240'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <os supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>rom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pflash</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>yes</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='secure'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </loader>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </os>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>memfd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </memoryBacking>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>disk</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>floppy</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>lun</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ide</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>fdc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>sata</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </disk>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vnc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </graphics>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <video supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vga</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>none</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>bochs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </video>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='mode'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>requisite</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>optional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pci</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hostdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>random</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </rng>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>path</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>handle</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </filesystem>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emulator</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>external</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>2.0</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </tpm>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </redirdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </channel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </crypto>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>passt</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </interface>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>isa</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </panic>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <console supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>null</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dev</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pipe</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stdio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>udp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tcp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </console>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='features'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vapic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>runtime</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>synic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stimer</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reset</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ipi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>avic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hyperv>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tdx</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </launchSecurity>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.597 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.602 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 02:33:22 np0005531887 nova_compute[185912]: <domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <arch>x86_64</arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <vcpu max='4096'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <os supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='firmware'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>efi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>rom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pflash</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>yes</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='secure'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>yes</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </loader>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </os>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </cpu>
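The CPU section above (the same shape `virsh domcapabilities` reports) lists each named model with `usable='yes'` or `usable='no'`, and for unusable models a `<blockers>` element naming the host-missing features. A minimal sketch of filtering that report programmatically, using a hypothetical trimmed-down sample of the fragment logged above:

```python
import xml.etree.ElementTree as ET

# Hypothetical subset mirroring the <model>/<blockers> layout in the log above.
sample = """
<cpu>
  <mode name='custom' supported='yes'>
    <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
    <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
    <blockers model='Skylake-Client-v1'>
      <feature name='erms'/>
      <feature name='pcid'/>
    </blockers>
  </mode>
</cpu>
"""

root = ET.fromstring(sample)

# Models the host can expose to guests without missing any required feature.
usable = [m.text for m in root.iter('model') if m.get('usable') == 'yes']

# For each blocked model, the CPU features the host lacks.
blocked = {
    b.get('model'): [f.get('name') for f in b.findall('feature')]
    for b in root.iter('blockers')
}

print(usable)
print(blocked)
```

In a real deployment the XML would come from libvirt's `getDomainCapabilities()` (or `virsh domcapabilities`) rather than an inline string; Nova uses this same report to decide which `cpu_models` values are valid for the host.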
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>memfd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </memoryBacking>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>disk</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>floppy</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>lun</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>fdc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>sata</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </disk>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vnc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </graphics>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <video supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vga</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>none</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>bochs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </video>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='mode'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>requisite</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>optional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pci</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hostdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>random</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </rng>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>path</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>handle</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </filesystem>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emulator</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>external</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>2.0</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </tpm>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </redirdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </channel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </crypto>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>passt</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </interface>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>isa</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </panic>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <console supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>null</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dev</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pipe</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stdio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>udp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tcp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </console>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='features'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vapic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>runtime</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>synic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stimer</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reset</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ipi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>avic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hyperv>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tdx</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </launchSecurity>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.674 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 02:33:22 np0005531887 nova_compute[185912]: <domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <domain>kvm</domain>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <arch>x86_64</arch>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <vcpu max='240'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <iothreads supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <os supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='firmware'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <loader supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>rom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pflash</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='readonly'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>yes</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='secure'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>no</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </loader>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </os>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='maximumMigratable'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>on</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>off</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <vendor>AMD</vendor>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='succor'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <mode name='custom' supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Denverton-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='auto-ibrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amd-psfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='stibp-always-on'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='EPYC-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-128'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-256'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx10-512'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='prefetchiti'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Haswell-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512er'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512pf'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fma4'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tbm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xop'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='amx-tile'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-bf16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-fp16'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bitalg'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrc'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fzrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='la57'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='taa-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xfd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ifma'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cmpccxadd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fbsdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='fsrs'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ibrs-all'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mcdt-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pbrsb-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='psdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='serialize'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vaes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='hle'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='rtm'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512bw'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512cd'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512dq'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512f'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='avx512vl'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='invpcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pcid'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='pku'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='mpx'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='core-capability'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='split-lock-detect'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='cldemote'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='erms'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='gfni'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdir64b'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='movdiri'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='xsaves'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='athlon-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='core2duo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='coreduo-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='n270-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='ss'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <blockers model='phenom-v1'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnow'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <feature name='3dnowext'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </blockers>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </mode>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <memoryBacking supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <enum name='sourceType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>anonymous</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <value>memfd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </memoryBacking>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <disk supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='diskDevice'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>disk</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cdrom</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>floppy</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>lun</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ide</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>fdc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>sata</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </disk>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <graphics supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vnc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egl-headless</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </graphics>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <video supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='modelType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vga</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>cirrus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>none</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>bochs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ramfb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </video>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hostdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='mode'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>subsystem</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='startupPolicy'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>mandatory</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>requisite</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>optional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='subsysType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pci</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>scsi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='capsType'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='pciBackend'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hostdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <rng supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtio-non-transitional</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>random</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>egd</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </rng>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <filesystem supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='driverType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>path</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>handle</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>virtiofs</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </filesystem>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <tpm supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-tis</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tpm-crb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emulator</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>external</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendVersion'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>2.0</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </tpm>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <redirdev supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='bus'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>usb</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </redirdev>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <channel supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </channel>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <crypto supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendModel'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>builtin</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </crypto>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <interface supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='backendType'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>default</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>passt</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </interface>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <panic supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='model'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>isa</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>hyperv</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </panic>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <console supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='type'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>null</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vc</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pty</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dev</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>file</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>pipe</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stdio</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>udp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tcp</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>unix</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>qemu-vdagent</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>dbus</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </console>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </devices>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <gic supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <genid supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <backup supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <async-teardown supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <ps2 supported='yes'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sev supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <sgx supported='no'/>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <hyperv supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='features'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>relaxed</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vapic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>spinlocks</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vpindex</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>runtime</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>synic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>stimer</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reset</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>vendor_id</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>frequencies</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>reenlightenment</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tlbflush</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>ipi</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>avic</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>emsr_bitmap</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>xmm_input</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </defaults>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </hyperv>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    <launchSecurity supported='yes'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      <enum name='sectype'>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:        <value>tdx</value>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:      </enum>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:    </launchSecurity>
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  </features>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </domainCapabilities>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.744 185916 DEBUG nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.744 185916 INFO nova.virt.libvirt.host [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Secure Boot support detected#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.748 185916 INFO nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.761 185916 DEBUG nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 02:33:22 np0005531887 nova_compute[185912]:  <model>Nehalem</model>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: </cpu>
Nov 22 02:33:22 np0005531887 nova_compute[185912]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.764 185916 DEBUG nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.798 185916 INFO nova.virt.node [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Determined node identity 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.818 185916 WARNING nova.compute.manager [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Compute nodes ['9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.842 185916 INFO nova.compute.manager [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.991 185916 WARNING nova.compute.manager [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.991 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.992 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.992 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:22 np0005531887 nova_compute[185912]: 2025-11-22 07:33:22.992 185916 DEBUG nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:33:23 np0005531887 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 02:33:23 np0005531887 systemd[1]: Started libvirt nodedev daemon.
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.317 185916 WARNING nova.virt.libvirt.driver [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.319 185916 DEBUG nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6211MB free_disk=73.6649284362793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.319 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.319 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.343 185916 WARNING nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] No compute node record for compute-1.ctlplane.example.com:9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 could not be found.#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.384 185916 INFO nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.464 185916 DEBUG nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.465 185916 DEBUG nova.compute.resource_tracker [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:33:23 np0005531887 python3.9[186787]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:33:23 np0005531887 systemd[1]: Stopping nova_compute container...
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.575 185916 DEBUG oslo_concurrency.lockutils [None req-b632d896-1ff7-4b13-ba1e-50cd845c82ac - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.576 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.576 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:23 np0005531887 nova_compute[185912]: 2025-11-22 07:33:23.576 185916 DEBUG oslo_concurrency.lockutils [None req-9aa2fa6f-528d-4cdf-aca0-bfc01ff7eae0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:24 np0005531887 virtqemud[186424]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 22 02:33:24 np0005531887 virtqemud[186424]: hostname: compute-1
Nov 22 02:33:24 np0005531887 virtqemud[186424]: End of file while reading data: Input/output error
Nov 22 02:33:24 np0005531887 systemd[1]: libpod-61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349.scope: Deactivated successfully.
Nov 22 02:33:24 np0005531887 systemd[1]: libpod-61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349.scope: Consumed 3.611s CPU time.
Nov 22 02:33:24 np0005531887 podman[186793]: 2025-11-22 07:33:24.045478115 +0000 UTC m=+0.506980443 container died 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:33:24 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349-userdata-shm.mount: Deactivated successfully.
Nov 22 02:33:24 np0005531887 systemd[1]: var-lib-containers-storage-overlay-caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721-merged.mount: Deactivated successfully.
Nov 22 02:33:24 np0005531887 podman[186793]: 2025-11-22 07:33:24.112944891 +0000 UTC m=+0.574447219 container cleanup 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Nov 22 02:33:24 np0005531887 podman[186793]: nova_compute
Nov 22 02:33:24 np0005531887 podman[186819]: nova_compute
Nov 22 02:33:24 np0005531887 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 22 02:33:24 np0005531887 systemd[1]: Stopped nova_compute container.
Nov 22 02:33:24 np0005531887 systemd[1]: Starting nova_compute container...
Nov 22 02:33:24 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:33:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/caf13b0be9116136c9409e97e415e056024416dab4914f64f87bc6e45e896721/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:24 np0005531887 podman[186833]: 2025-11-22 07:33:24.291344336 +0000 UTC m=+0.091716094 container init 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 02:33:24 np0005531887 podman[186833]: 2025-11-22 07:33:24.302866975 +0000 UTC m=+0.103238713 container start 61441c72d43f0f782bd8d71166bea354a498d29dd66d7d908beb0494ee8f7349 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + sudo -E kolla_set_configs
Nov 22 02:33:24 np0005531887 podman[186833]: nova_compute
Nov 22 02:33:24 np0005531887 systemd[1]: Started nova_compute container.
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Validating config file
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying service configuration files
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /etc/ceph
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Creating directory /etc/ceph
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Writing out command to execute
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:24 np0005531887 nova_compute[186849]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 02:33:24 np0005531887 nova_compute[186849]: ++ cat /run_command
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + CMD=nova-compute
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + ARGS=
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + sudo kolla_copy_cacerts
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + [[ ! -n '' ]]
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + . kolla_extend_start
Nov 22 02:33:24 np0005531887 nova_compute[186849]: Running command: 'nova-compute'
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + umask 0022
Nov 22 02:33:24 np0005531887 nova_compute[186849]: + exec nova-compute
Nov 22 02:33:25 np0005531887 python3.9[187012]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 02:33:25 np0005531887 systemd[1]: Started libpod-conmon-b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c.scope.
Nov 22 02:33:25 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:33:25 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f3ead2c5d84716e43697b636082af7e7bafe1fcf61fb33a6ecde74fe81bc91/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f3ead2c5d84716e43697b636082af7e7bafe1fcf61fb33a6ecde74fe81bc91/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f3ead2c5d84716e43697b636082af7e7bafe1fcf61fb33a6ecde74fe81bc91/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 22 02:33:25 np0005531887 podman[187039]: 2025-11-22 07:33:25.529385991 +0000 UTC m=+0.129463059 container init b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:33:25 np0005531887 podman[187039]: 2025-11-22 07:33:25.539369593 +0000 UTC m=+0.139446631 container start b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:33:25 np0005531887 python3.9[187012]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Applying nova statedir ownership
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 22 02:33:25 np0005531887 nova_compute_init[187061]: INFO:nova_statedir:Nova statedir ownership complete
Nov 22 02:33:25 np0005531887 systemd[1]: libpod-b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c.scope: Deactivated successfully.
Nov 22 02:33:25 np0005531887 podman[187075]: 2025-11-22 07:33:25.629243013 +0000 UTC m=+0.021473142 container died b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:33:25 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c-userdata-shm.mount: Deactivated successfully.
Nov 22 02:33:25 np0005531887 systemd[1]: var-lib-containers-storage-overlay-f0f3ead2c5d84716e43697b636082af7e7bafe1fcf61fb33a6ecde74fe81bc91-merged.mount: Deactivated successfully.
Nov 22 02:33:25 np0005531887 podman[187075]: 2025-11-22 07:33:25.846272205 +0000 UTC m=+0.238502284 container cleanup b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:33:25 np0005531887 systemd[1]: libpod-conmon-b1900248f40c924d32ec4ca80c205fe0fd94b3c0a558b23ce248b1e1b23ec07c.scope: Deactivated successfully.
Nov 22 02:33:26 np0005531887 systemd[1]: session-24.scope: Deactivated successfully.
Nov 22 02:33:26 np0005531887 systemd[1]: session-24.scope: Consumed 1min 55.463s CPU time.
Nov 22 02:33:26 np0005531887 systemd-logind[821]: Session 24 logged out. Waiting for processes to exit.
Nov 22 02:33:26 np0005531887 systemd-logind[821]: Removed session 24.
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.589 186853 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.590 186853 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.590 186853 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.590 186853 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.734 186853 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.757 186853 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:33:26 np0005531887 nova_compute[186849]: 2025-11-22 07:33:26.757 186853 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.232 186853 INFO nova.virt.driver [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.343 186853 INFO nova.compute.provider_config [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.388 186853 DEBUG oslo_concurrency.lockutils [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.389 186853 DEBUG oslo_concurrency.lockutils [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.389 186853 DEBUG oslo_concurrency.lockutils [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.389 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.390 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.391 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.392 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.393 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.394 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.395 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.396 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.397 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.397 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.397 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.397 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.397 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.398 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.399 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.400 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.401 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.402 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.403 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.404 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.405 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.406 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.407 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.408 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.409 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.410 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.411 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.412 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.413 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.414 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.415 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.416 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.416 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.416 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.416 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.416 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.417 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.418 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.419 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.420 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.421 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.422 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.423 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.424 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.425 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.426 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.427 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.428 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.429 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.430 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.431 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.432 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.433 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.434 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.435 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.436 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.437 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.438 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.439 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.440 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.441 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.441 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.441 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.441 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.441 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.442 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.443 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.444 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.445 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.446 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.447 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.448 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.449 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.450 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.450 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.450 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.450 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.450 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.451 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.451 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.451 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.451 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.451 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.452 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.453 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.454 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.455 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.455 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.455 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.455 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.455 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.456 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.457 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.458 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.458 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.458 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.458 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.458 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.459 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.460 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.461 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.462 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.463 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.464 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.465 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.465 186853 WARNING oslo_config.cfg [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 02:33:27 np0005531887 nova_compute[186849]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 02:33:27 np0005531887 nova_compute[186849]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 02:33:27 np0005531887 nova_compute[186849]: and ``live_migration_inbound_addr`` respectively.
Nov 22 02:33:27 np0005531887 nova_compute[186849]: ).  Its value may be silently ignored in the future.#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.465 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.465 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.465 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.466 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.466 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.466 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.466 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.466 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.467 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.467 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.467 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.467 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.467 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.468 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.469 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.470 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.471 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.472 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.473 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.473 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.473 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.473 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.473 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.474 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.475 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.476 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.477 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.478 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.479 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.480 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.481 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.482 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.483 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.483 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.483 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.484 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.484 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.485 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.485 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.486 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.486 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.487 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.487 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.488 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.489 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.489 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.489 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.491 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.491 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.492 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.492 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.492 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.493 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.493 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.494 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.495 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.495 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.496 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.496 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.498 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.499 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.500 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.501 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.501 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.502 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.503 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.503 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.504 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.505 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.505 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.507 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.507 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.508 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.509 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.509 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.510 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.511 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.512 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.512 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.513 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.513 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.513 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.513 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.514 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.514 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.514 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.514 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.515 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.516 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.516 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.516 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.516 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.516 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.517 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.518 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.519 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.520 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.521 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.522 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.523 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.523 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.523 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.523 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.523 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.524 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.525 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.526 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.527 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.527 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.527 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.527 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.527 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.528 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.528 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.528 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.528 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.529 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.530 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.531 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.532 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.533 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.534 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.535 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.536 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.536 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.536 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.536 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.536 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.537 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.537 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.537 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.537 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.537 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.538 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.538 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.538 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.538 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.538 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.539 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.540 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.540 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.540 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.540 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.540 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.541 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.541 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.541 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.541 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.541 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.542 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.543 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.544 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.545 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.546 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.547 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.548 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.549 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.550 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.551 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.552 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.552 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.552 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.552 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.552 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.553 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.554 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.555 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.556 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.556 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.556 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.556 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.556 186853 DEBUG oslo_service.service [None req-60b91863-7298-4e7f-8f4c-64a39d8584f1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.557 186853 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.576 186853 INFO nova.virt.node [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Determined node identity 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.577 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.577 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.578 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.578 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.589 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1229b43700> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.591 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1229b43700> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.592 186853 INFO nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.599 186853 INFO nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <host>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <uuid>e5ccb90d-580e-48d9-a7d0-f6edef583e11</uuid>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <arch>x86_64</arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <microcode version='16777317'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <signature family='23' model='49' stepping='0'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='x2apic'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='tsc-deadline'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='osxsave'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='hypervisor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='tsc_adjust'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='spec-ctrl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='stibp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='arch-capabilities'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='cmp_legacy'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='topoext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='virt-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='lbrv'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='tsc-scale'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='vmcb-clean'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='pause-filter'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='pfthreshold'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='rdctl-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='skip-l1dfl-vmentry'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='mds-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature name='pschange-mc-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <pages unit='KiB' size='4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <pages unit='KiB' size='2048'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <pages unit='KiB' size='1048576'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <power_management>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <suspend_mem/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <suspend_disk/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <suspend_hybrid/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </power_management>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <iommu support='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <migration_features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <live/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <uri_transports>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <uri_transport>tcp</uri_transport>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <uri_transport>rdma</uri_transport>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </uri_transports>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </migration_features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <topology>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <cells num='1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <cell id='0'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <memory unit='KiB'>7864316</memory>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <pages unit='KiB' size='2048'>0</pages>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <distances>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <sibling id='0' value='10'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          </distances>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          <cpus num='8'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:          </cpus>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        </cell>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </cells>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </topology>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <cache>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </cache>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <secmodel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model>selinux</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <doi>0</doi>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </secmodel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <secmodel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model>dac</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <doi>0</doi>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </secmodel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </host>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <guest>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <os_type>hvm</os_type>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <arch name='i686'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <wordsize>32</wordsize>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <domain type='qemu'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <domain type='kvm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <pae/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <nonpae/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <apic default='on' toggle='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <cpuselection/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <deviceboot/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <externalSnapshot/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </guest>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <guest>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <os_type>hvm</os_type>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <arch name='x86_64'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <wordsize>64</wordsize>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <domain type='qemu'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <domain type='kvm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <acpi default='on' toggle='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <apic default='on' toggle='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <cpuselection/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <deviceboot/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <disksnapshot default='on' toggle='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <externalSnapshot/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </guest>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </capabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: #033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.605 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.609 186853 DEBUG nova.virt.libvirt.volume.mount [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.610 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 02:33:27 np0005531887 nova_compute[186849]: <domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <arch>i686</arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <vcpu max='4096'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <os supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>rom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pflash</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>yes</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='secure'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </loader>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>memfd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </memoryBacking>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>disk</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>floppy</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>lun</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>fdc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>sata</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vnc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <video supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vga</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>none</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>bochs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='mode'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>requisite</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>optional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pci</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hostdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>random</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>path</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>handle</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </filesystem>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emulator</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>external</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>2.0</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </tpm>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </redirdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </channel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </crypto>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>passt</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>isa</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </panic>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <console supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>null</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dev</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pipe</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stdio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>udp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tcp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </console>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='features'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vapic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>runtime</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>synic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stimer</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reset</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ipi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>avic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hyperv>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tdx</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </launchSecurity>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.615 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 02:33:27 np0005531887 nova_compute[186849]: <domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <arch>i686</arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <vcpu max='240'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <os supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>rom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pflash</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>yes</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='secure'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </loader>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>memfd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </memoryBacking>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>disk</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>floppy</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>lun</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ide</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>fdc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>sata</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vnc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <video supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vga</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>none</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>bochs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='mode'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>requisite</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>optional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pci</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hostdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>random</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>path</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>handle</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </filesystem>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emulator</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>external</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>2.0</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </tpm>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </redirdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </channel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </crypto>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>passt</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>isa</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </panic>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <console supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>null</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dev</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pipe</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stdio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>udp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tcp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </console>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='features'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vapic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>runtime</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>synic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stimer</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reset</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ipi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>avic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hyperv>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tdx</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </launchSecurity>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.643 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.647 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 02:33:27 np0005531887 nova_compute[186849]: <domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <arch>x86_64</arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <vcpu max='4096'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <os supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='firmware'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>efi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>rom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pflash</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>yes</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='secure'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>yes</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </loader>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>memfd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </memoryBacking>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>disk</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>floppy</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>lun</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>fdc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>sata</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vnc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <video supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vga</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>none</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>bochs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='mode'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>requisite</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>optional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pci</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hostdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>random</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>path</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>handle</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </filesystem>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emulator</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>external</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>2.0</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </tpm>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </redirdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </channel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </crypto>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>passt</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>isa</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </panic>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <console supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>null</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dev</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pipe</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stdio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>udp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tcp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </console>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='features'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vapic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>runtime</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>synic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stimer</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reset</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ipi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>avic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hyperv>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tdx</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </launchSecurity>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.710 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 02:33:27 np0005531887 nova_compute[186849]: <domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <path>/usr/libexec/qemu-kvm</path>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <domain>kvm</domain>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <arch>x86_64</arch>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <vcpu max='240'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <iothreads supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <os supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='firmware'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <loader supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>rom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pflash</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='readonly'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>yes</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='secure'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>no</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </loader>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-passthrough' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='hostPassthroughMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='maximum' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='maximumMigratable'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>on</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>off</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='host-model' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <vendor>AMD</vendor>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='x2apic'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-deadline'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='hypervisor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc_adjust'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='spec-ctrl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='stibp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='cmp_legacy'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='overflow-recov'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='succor'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='amd-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='virt-ssbd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lbrv'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='tsc-scale'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='vmcb-clean'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='flushbyasid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pause-filter'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='pfthreshold'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='svme-addr-chk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <feature policy='disable' name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <mode name='custom' supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Broadwell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cascadelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Cooperlake-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Denverton-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Dhyana-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Genoa-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='auto-ibrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Milan-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amd-psfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='no-nested-data-bp'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='null-sel-clr-base'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='stibp-always-on'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-Rome-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='EPYC-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='GraniteRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-128'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-256'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx10-512'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='prefetchiti'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Haswell-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-noTSX'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v6'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Icelake-Server-v7'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='IvyBridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='KnightsMill-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4fmaps'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-4vnniw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512er'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512pf'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G4-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Opteron_G5-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fma4'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tbm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xop'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SapphireRapids-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='amx-tile'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-bf16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-fp16'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512-vpopcntdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bitalg'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vbmi2'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrc'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fzrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='la57'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='taa-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='tsx-ldtrk'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xfd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='SierraForest-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ifma'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-ne-convert'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx-vnni-int8'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='bus-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cmpccxadd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fbsdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='fsrs'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ibrs-all'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mcdt-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pbrsb-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='psdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='sbdr-ssdp-no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='serialize'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vaes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='vpclmulqdq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Client-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='hle'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='rtm'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Skylake-Server-v5'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512bw'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512cd'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512dq'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512f'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='avx512vl'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='invpcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pcid'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='pku'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='mpx'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v2'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v3'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='core-capability'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='split-lock-detect'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='Snowridge-v4'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='cldemote'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='erms'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='gfni'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdir64b'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='movdiri'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='xsaves'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='athlon-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='core2duo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='coreduo-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='n270-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='ss'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <blockers model='phenom-v1'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnow'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <feature name='3dnowext'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </blockers>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </mode>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <memoryBacking supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <enum name='sourceType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>anonymous</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <value>memfd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </memoryBacking>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <disk supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='diskDevice'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>disk</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cdrom</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>floppy</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>lun</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ide</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>fdc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>sata</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <graphics supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vnc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egl-headless</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <video supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='modelType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vga</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>cirrus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>none</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>bochs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ramfb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hostdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='mode'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>subsystem</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='startupPolicy'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>mandatory</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>requisite</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>optional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='subsysType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pci</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>scsi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='capsType'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='pciBackend'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hostdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <rng supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtio-non-transitional</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>random</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>egd</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <filesystem supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='driverType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>path</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>handle</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>virtiofs</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </filesystem>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <tpm supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-tis</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tpm-crb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emulator</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>external</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendVersion'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>2.0</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </tpm>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <redirdev supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='bus'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>usb</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </redirdev>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <channel supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </channel>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <crypto supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendModel'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>builtin</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </crypto>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <interface supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='backendType'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>default</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>passt</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <panic supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='model'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>isa</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>hyperv</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </panic>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <console supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='type'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>null</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vc</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pty</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dev</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>file</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>pipe</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stdio</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>udp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tcp</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>unix</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>qemu-vdagent</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>dbus</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </console>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <gic supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <vmcoreinfo supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <genid supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backingStoreInput supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <backup supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <async-teardown supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <ps2 supported='yes'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sev supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <sgx supported='no'/>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <hyperv supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='features'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>relaxed</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vapic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>spinlocks</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vpindex</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>runtime</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>synic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>stimer</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reset</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>vendor_id</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>frequencies</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>reenlightenment</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tlbflush</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>ipi</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>avic</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>emsr_bitmap</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>xmm_input</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <spinlocks>4095</spinlocks>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <stimer_direct>on</stimer_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_direct>on</tlbflush_direct>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <tlbflush_extended>on</tlbflush_extended>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </defaults>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </hyperv>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    <launchSecurity supported='yes'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      <enum name='sectype'>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:        <value>tdx</value>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:      </enum>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:    </launchSecurity>
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </domainCapabilities>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.777 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.777 186853 INFO nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Secure Boot support detected#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.779 186853 INFO nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.789 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 02:33:27 np0005531887 nova_compute[186849]:  <model>Nehalem</model>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: </cpu>
Nov 22 02:33:27 np0005531887 nova_compute[186849]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.791 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.813 186853 INFO nova.virt.node [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Determined node identity 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from /var/lib/nova/compute_id#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.827 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Verified node 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 22 02:33:27 np0005531887 nova_compute[186849]: 2025-11-22 07:33:27.843 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.521 186853 ERROR nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Could not retrieve compute node resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78' not found: No resource provider with uuid 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 found  ", "request_id": "req-c5bdc3f3-58de-4f55-90cf-66c3538dc48c"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78' not found: No resource provider with uuid 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 found  ", "request_id": "req-c5bdc3f3-58de-4f55-90cf-66c3538dc48c"}]}#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.542 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.542 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.542 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.543 186853 DEBUG nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.686 186853 WARNING nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.687 186853 DEBUG nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6209MB free_disk=73.66344833374023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.687 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.687 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.783 186853 ERROR nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78' not found: No resource provider with uuid 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 found  ", "request_id": "req-1f16abd2-abb7-46d0-abe7-47fd12b7050d"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78' not found: No resource provider with uuid 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 found  ", "request_id": "req-1f16abd2-abb7-46d0-abe7-47fd12b7050d"}]}#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.783 186853 DEBUG nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.784 186853 DEBUG nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.943 186853 INFO nova.scheduler.client.report [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [req-584d5baf-7bf5-4eaf-a746-ce7cad5d26c4] Created resource provider record via placement API for resource provider with UUID 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 and name compute-1.ctlplane.example.com.#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.968 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 22 02:33:28 np0005531887 nova_compute[186849]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.968 186853 INFO nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.969 186853 DEBUG nova.compute.provider_tree [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.969 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:33:28 np0005531887 nova_compute[186849]: 2025-11-22 07:33:28.971 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Libvirt baseline CPU <cpu>
Nov 22 02:33:28 np0005531887 nova_compute[186849]:  <arch>x86_64</arch>
Nov 22 02:33:28 np0005531887 nova_compute[186849]:  <model>Nehalem</model>
Nov 22 02:33:28 np0005531887 nova_compute[186849]:  <vendor>AMD</vendor>
Nov 22 02:33:28 np0005531887 nova_compute[186849]:  <topology sockets="8" cores="1" threads="1"/>
Nov 22 02:33:28 np0005531887 nova_compute[186849]: </cpu>
Nov 22 02:33:28 np0005531887 nova_compute[186849]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.086 186853 DEBUG nova.scheduler.client.report [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Updated inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.086 186853 DEBUG nova.compute.provider_tree [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Updating resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.086 186853 DEBUG nova.compute.provider_tree [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.352 186853 DEBUG nova.compute.provider_tree [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Updating resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.610 186853 DEBUG nova.compute.resource_tracker [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.610 186853 DEBUG oslo_concurrency.lockutils [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.610 186853 DEBUG nova.service [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.650 186853 DEBUG nova.service [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 22 02:33:29 np0005531887 nova_compute[186849]: 2025-11-22 07:33:29.650 186853 DEBUG nova.servicegroup.drivers.db [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 22 02:33:31 np0005531887 systemd-logind[821]: New session 26 of user zuul.
Nov 22 02:33:31 np0005531887 systemd[1]: Started Session 26 of User zuul.
Nov 22 02:33:32 np0005531887 python3.9[187304]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 02:33:34 np0005531887 podman[187432]: 2025-11-22 07:33:34.371388413 +0000 UTC m=+0.095947147 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:33:34 np0005531887 python3.9[187477]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:34 np0005531887 systemd[1]: Reloading.
Nov 22 02:33:34 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:34 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:35 np0005531887 python3.9[187671]: ansible-ansible.builtin.service_facts Invoked
Nov 22 02:33:35 np0005531887 network[187688]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 02:33:35 np0005531887 network[187689]: 'network-scripts' will be removed from distribution in near future.
Nov 22 02:33:35 np0005531887 network[187690]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 02:33:36 np0005531887 nova_compute[186849]: 2025-11-22 07:33:36.653 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:33:36 np0005531887 nova_compute[186849]: 2025-11-22 07:33:36.671 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:33:37.304 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:33:37.304 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:33:37.304 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:33:40 np0005531887 python3.9[187964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:33:41 np0005531887 python3.9[188117]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:41 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:41 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:41 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:33:42 np0005531887 python3.9[188270]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:43 np0005531887 podman[188394]: 2025-11-22 07:33:43.636775341 +0000 UTC m=+0.067039276 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:33:43 np0005531887 python3.9[188441]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:33:44 np0005531887 python3.9[188593]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:33:45 np0005531887 python3.9[188745]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:33:45 np0005531887 systemd[1]: Reloading.
Nov 22 02:33:45 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:33:45 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:33:46 np0005531887 podman[188904]: 2025-11-22 07:33:46.526075324 +0000 UTC m=+0.082329403 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:33:46 np0005531887 python3.9[188950]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:33:47 np0005531887 python3.9[189108]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:48 np0005531887 python3.9[189258]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:49 np0005531887 python3.9[189410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:49 np0005531887 python3.9[189531]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796828.8249843-365-123288148967980/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:33:50 np0005531887 python3.9[189683]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 22 02:33:52 np0005531887 python3.9[189835]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 22 02:33:52 np0005531887 python3.9[189988]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 02:33:53 np0005531887 python3.9[190146]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 02:33:55 np0005531887 python3.9[190304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:55 np0005531887 python3.9[190425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796834.8880613-569-277567057467796/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:56 np0005531887 python3.9[190575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:57 np0005531887 python3.9[190696]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796836.0684078-569-189559431412198/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:57 np0005531887 python3.9[190846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:33:58 np0005531887 python3.9[190967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796837.2375991-569-280078553494964/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:33:58 np0005531887 python3.9[191117]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:33:59 np0005531887 python3.9[191269]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:00 np0005531887 python3.9[191423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:00 np0005531887 python3.9[191544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796839.9780133-746-191003191404471/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:01 np0005531887 python3.9[191694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:02 np0005531887 python3.9[191770]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:02 np0005531887 python3.9[191920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:03 np0005531887 python3.9[192041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796842.311075-746-189812393207080/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:04 np0005531887 python3.9[192191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:04 np0005531887 python3.9[192312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796843.5128782-746-279904384138540/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:04 np0005531887 podman[192313]: 2025-11-22 07:34:04.713618651 +0000 UTC m=+0.101093854 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:34:05 np0005531887 python3.9[192485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:05 np0005531887 python3.9[192606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796844.762322-746-20364656030685/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:06 np0005531887 python3.9[192756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:07 np0005531887 python3.9[192877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796846.0038064-746-220345884477152/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:07 np0005531887 python3.9[193027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:08 np0005531887 python3.9[193148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796847.2786262-746-31045699104838/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:08 np0005531887 python3.9[193298]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:09 np0005531887 python3.9[193419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796848.4264958-746-254606032354887/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:10 np0005531887 python3.9[193569]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:10 np0005531887 python3.9[193690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796849.6545558-746-135187906209046/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:11 np0005531887 python3.9[193840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:11 np0005531887 python3.9[193961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796850.8720474-746-221457257393076/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:12 np0005531887 python3.9[194111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:13 np0005531887 python3.9[194232]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796852.079739-746-199102747409987/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:13 np0005531887 podman[194332]: 2025-11-22 07:34:13.859973797 +0000 UTC m=+0.078295796 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:34:14 np0005531887 python3.9[194401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:14 np0005531887 python3.9[194477]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:15 np0005531887 python3.9[194627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:15 np0005531887 python3.9[194703]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:16 np0005531887 python3.9[194853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:16 np0005531887 podman[194903]: 2025-11-22 07:34:16.685592213 +0000 UTC m=+0.057400375 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:34:16 np0005531887 python3.9[194944]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:17 np0005531887 python3.9[195101]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:18 np0005531887 python3.9[195253]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:19 np0005531887 python3.9[195405]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:19 np0005531887 python3.9[195557]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:34:20 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:20 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:20 np0005531887 systemd[1]: Listening on Podman API Socket.
Nov 22 02:34:21 np0005531887 python3.9[195750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:21 np0005531887 python3.9[195873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7512932-1412-85696369936374/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:22 np0005531887 python3.9[195949]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:22 np0005531887 python3.9[196072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7512932-1412-85696369936374/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:24 np0005531887 python3.9[196224]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 22 02:34:25 np0005531887 python3.9[196376]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:26 np0005531887 python3[196528]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.806 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.807 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:34:26 np0005531887 podman[196566]: 2025-11-22 07:34:26.745560972 +0000 UTC m=+0.021087707 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.947 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.947 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.66294479370117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:34:26 np0005531887 nova_compute[186849]: 2025-11-22 07:34:26.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.004 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.004 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:34:27 np0005531887 podman[196566]: 2025-11-22 07:34:27.00557612 +0000 UTC m=+0.281102855 container create 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible)
Nov 22 02:34:27 np0005531887 python3[196528]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.027 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.039 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.040 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:34:27 np0005531887 nova_compute[186849]: 2025-11-22 07:34:27.041 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:34:27 np0005531887 python3.9[196754]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:28 np0005531887 python3.9[196908]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:29 np0005531887 python3.9[197059]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796868.8027327-1604-184011785740144/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:30 np0005531887 python3.9[197135]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:34:30 np0005531887 systemd[1]: Reloading.
Nov 22 02:34:30 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:30 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:31 np0005531887 python3.9[197247]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:31 np0005531887 systemd[1]: Reloading.
Nov 22 02:34:31 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:31 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:31 np0005531887 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 02:34:31 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:34:31 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:31 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:31 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:31 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:31 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.
Nov 22 02:34:32 np0005531887 podman[197286]: 2025-11-22 07:34:32.173964456 +0000 UTC m=+0.450243234 container init 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + sudo -E kolla_set_configs
Nov 22 02:34:32 np0005531887 podman[197286]: 2025-11-22 07:34:32.203711912 +0000 UTC m=+0.479990670 container start 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Validating config file
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Copying service configuration files
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: INFO:__main__:Writing out command to execute
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: ++ cat /run_command
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + ARGS=
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + sudo kolla_copy_cacerts
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + [[ ! -n '' ]]
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + . kolla_extend_start
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + umask 0022
Nov 22 02:34:32 np0005531887 ceilometer_agent_compute[197303]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 02:34:32 np0005531887 podman[197286]: ceilometer_agent_compute
Nov 22 02:34:32 np0005531887 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 02:34:32 np0005531887 podman[197310]: 2025-11-22 07:34:32.370249735 +0000 UTC m=+0.155582636 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:34:32 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-37502d6f232b5dbb.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:34:32 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-37502d6f232b5dbb.service: Failed with result 'exit-code'.
Nov 22 02:34:32 np0005531887 auditd[701]: Audit daemon rotating log files
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.232 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.233 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.234 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.235 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.236 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.237 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.238 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.239 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.242 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.243 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.244 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.245 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.249 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.268 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.270 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.271 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.391 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:33 np0005531887 python3.9[197486]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:34:33 np0005531887 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.520 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.522 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.523 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.524 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.525 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.526 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.527 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.528 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.529 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.530 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.531 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.533 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.534 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.535 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.536 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.538 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.539 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.545 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.545 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.545 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.547 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.551 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.555 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.555 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.555 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.556 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.557 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.857 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.958 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.959 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.959 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 22 02:34:33 np0005531887 ceilometer_agent_compute[197303]: 2025-11-22 07:34:33.968 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 22 02:34:33 np0005531887 virtqemud[186424]: End of file while reading data: Input/output error
Nov 22 02:34:33 np0005531887 virtqemud[186424]: End of file while reading data: Input/output error
Nov 22 02:34:34 np0005531887 systemd[1]: libpod-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope: Deactivated successfully.
Nov 22 02:34:34 np0005531887 systemd[1]: libpod-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope: Consumed 1.562s CPU time.
Nov 22 02:34:34 np0005531887 podman[197493]: 2025-11-22 07:34:34.155417605 +0000 UTC m=+0.626989235 container died 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0)
Nov 22 02:34:34 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-37502d6f232b5dbb.timer: Deactivated successfully.
Nov 22 02:34:34 np0005531887 systemd[1]: Stopped /usr/bin/podman healthcheck run 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.
Nov 22 02:34:34 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-userdata-shm.mount: Deactivated successfully.
Nov 22 02:34:34 np0005531887 systemd[1]: var-lib-containers-storage-overlay-c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83-merged.mount: Deactivated successfully.
Nov 22 02:34:34 np0005531887 podman[197493]: 2025-11-22 07:34:34.833594591 +0000 UTC m=+1.305166221 container cleanup 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image)
Nov 22 02:34:34 np0005531887 podman[197493]: ceilometer_agent_compute
Nov 22 02:34:34 np0005531887 podman[197538]: ceilometer_agent_compute
Nov 22 02:34:34 np0005531887 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 22 02:34:34 np0005531887 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 22 02:34:34 np0005531887 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 02:34:34 np0005531887 podman[197526]: 2025-11-22 07:34:34.925203752 +0000 UTC m=+0.149397125 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:34:35 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:34:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5145d3bd841d501a9b6904044c0ae4cf99536f1a1d3e03f1a503da9097aaa83/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:35 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.
Nov 22 02:34:35 np0005531887 podman[197559]: 2025-11-22 07:34:35.411701061 +0000 UTC m=+0.490044057 container init 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2)
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + sudo -E kolla_set_configs
Nov 22 02:34:35 np0005531887 podman[197559]: 2025-11-22 07:34:35.440140507 +0000 UTC m=+0.518483493 container start 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2)
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Validating config file
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Copying service configuration files
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: INFO:__main__:Writing out command to execute
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: ++ cat /run_command
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + ARGS=
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + sudo kolla_copy_cacerts
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: sudo: unable to send audit message: Operation not permitted
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + [[ ! -n '' ]]
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + . kolla_extend_start
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + umask 0022
Nov 22 02:34:35 np0005531887 ceilometer_agent_compute[197578]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 02:34:35 np0005531887 podman[197559]: ceilometer_agent_compute
Nov 22 02:34:35 np0005531887 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 02:34:35 np0005531887 podman[197585]: 2025-11-22 07:34:35.627086748 +0000 UTC m=+0.177299777 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 02:34:35 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:34:35 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Failed with result 'exit-code'.
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.426 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.426 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.426 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.426 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.427 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.427 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.427 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.427 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.428 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.429 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.429 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.429 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.429 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.429 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.430 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.431 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.431 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.431 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.431 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.432 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.433 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.434 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.435 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.436 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.437 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.438 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.439 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.440 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.441 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.442 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.443 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.444 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.445 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.446 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.447 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.448 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.449 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.450 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.451 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.452 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.452 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.452 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.452 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.470 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.471 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.472 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.485 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:36 np0005531887 python3.9[197761]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.623 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.624 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.625 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.626 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.627 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.628 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.629 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.630 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.631 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.632 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.633 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.634 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.635 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.636 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.637 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.638 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.639 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.640 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.641 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.642 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.643 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.644 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.645 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.648 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.654 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:34:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:34:37 np0005531887 python3.9[197890]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796876.1331768-1700-93451256324198/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:34:37.305 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:34:37.306 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:34:37.306 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:34:38 np0005531887 python3.9[198042]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 22 02:34:39 np0005531887 python3.9[198194]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:40 np0005531887 python3[198346]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:34:40 np0005531887 podman[198380]: 2025-11-22 07:34:40.329192248 +0000 UTC m=+0.020164024 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 02:34:41 np0005531887 podman[198380]: 2025-11-22 07:34:41.167659344 +0000 UTC m=+0.858631100 container create 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Nov 22 02:34:41 np0005531887 python3[198346]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume 
/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 22 02:34:41 np0005531887 python3.9[198568]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:34:43 np0005531887 python3.9[198722]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:43 np0005531887 python3.9[198873]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796883.0784075-1859-161092196784680/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:34:44 np0005531887 podman[198921]: 2025-11-22 07:34:44.015397182 +0000 UTC m=+0.083436212 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 02:34:44 np0005531887 python3.9[198968]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:34:44 np0005531887 systemd[1]: Reloading.
Nov 22 02:34:44 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:44 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:45 np0005531887 python3.9[199078]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:34:45 np0005531887 systemd[1]: Reloading.
Nov 22 02:34:45 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:34:45 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:34:45 np0005531887 systemd[1]: Starting node_exporter container...
Nov 22 02:34:46 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:34:46 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/061375d171964e0a9fedf2f52811f14e94959eccc345927c65672836c63a4aee/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:46 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/061375d171964e0a9fedf2f52811f14e94959eccc345927c65672836c63a4aee/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:46 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.
Nov 22 02:34:46 np0005531887 podman[199118]: 2025-11-22 07:34:46.642377201 +0000 UTC m=+0.798043599 container init 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.657Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.658Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.659Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 02:34:46 np0005531887 node_exporter[199133]: ts=2025-11-22T07:34:46.660Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 02:34:46 np0005531887 podman[199118]: 2025-11-22 07:34:46.671980844 +0000 UTC m=+0.827647232 container start 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:34:46 np0005531887 podman[199118]: node_exporter
Nov 22 02:34:46 np0005531887 systemd[1]: Started node_exporter container.
Nov 22 02:34:46 np0005531887 podman[199152]: 2025-11-22 07:34:46.887997168 +0000 UTC m=+0.109433317 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:34:46 np0005531887 podman[199142]: 2025-11-22 07:34:46.916100855 +0000 UTC m=+0.233449190 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:34:47 np0005531887 python3.9[199337]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:34:48 np0005531887 systemd[1]: Stopping node_exporter container...
Nov 22 02:34:48 np0005531887 systemd[1]: libpod-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope: Deactivated successfully.
Nov 22 02:34:48 np0005531887 conmon[199133]: conmon 3be178ecb3c6af9d186f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope/container/memory.events
Nov 22 02:34:48 np0005531887 podman[199341]: 2025-11-22 07:34:48.4217696 +0000 UTC m=+0.391068985 container died 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:34:49 np0005531887 systemd[1]: 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f-363bed621afda5c4.timer: Deactivated successfully.
Nov 22 02:34:49 np0005531887 systemd[1]: Stopped /usr/bin/podman healthcheck run 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.
Nov 22 02:34:49 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f-userdata-shm.mount: Deactivated successfully.
Nov 22 02:34:49 np0005531887 systemd[1]: var-lib-containers-storage-overlay-061375d171964e0a9fedf2f52811f14e94959eccc345927c65672836c63a4aee-merged.mount: Deactivated successfully.
Nov 22 02:34:50 np0005531887 podman[199341]: 2025-11-22 07:34:50.459104517 +0000 UTC m=+2.428403902 container cleanup 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:34:50 np0005531887 podman[199341]: node_exporter
Nov 22 02:34:50 np0005531887 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:34:50 np0005531887 podman[199371]: node_exporter
Nov 22 02:34:50 np0005531887 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 22 02:34:50 np0005531887 systemd[1]: Stopped node_exporter container.
Nov 22 02:34:50 np0005531887 systemd[1]: Starting node_exporter container...
Nov 22 02:34:50 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:34:50 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/061375d171964e0a9fedf2f52811f14e94959eccc345927c65672836c63a4aee/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:50 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/061375d171964e0a9fedf2f52811f14e94959eccc345927c65672836c63a4aee/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:34:51 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.
Nov 22 02:34:51 np0005531887 podman[199384]: 2025-11-22 07:34:51.302074112 +0000 UTC m=+0.755612059 container init 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.314Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.315Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.316Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 02:34:51 np0005531887 node_exporter[199399]: ts=2025-11-22T07:34:51.316Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 02:34:51 np0005531887 podman[199384]: 2025-11-22 07:34:51.325144181 +0000 UTC m=+0.778682108 container start 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:34:51 np0005531887 podman[199384]: node_exporter
Nov 22 02:34:51 np0005531887 systemd[1]: Started node_exporter container.
Nov 22 02:34:51 np0005531887 podman[199409]: 2025-11-22 07:34:51.618303951 +0000 UTC m=+0.284378819 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:34:52 np0005531887 python3.9[199584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:34:52 np0005531887 python3.9[199707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796891.786568-1955-252821016342927/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:34:54 np0005531887 python3.9[199859]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 22 02:34:55 np0005531887 python3.9[200011]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:34:56 np0005531887 python3[200163]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:35:01 np0005531887 podman[200176]: 2025-11-22 07:35:01.374710489 +0000 UTC m=+4.876793276 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 02:35:01 np0005531887 podman[200272]: 2025-11-22 07:35:01.533778627 +0000 UTC m=+0.055110794 container create 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:01 np0005531887 podman[200272]: 2025-11-22 07:35:01.498029742 +0000 UTC m=+0.019361929 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 02:35:01 np0005531887 python3[200163]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 22 02:35:02 np0005531887 python3.9[200461]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:35:03 np0005531887 python3.9[200615]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:04 np0005531887 python3.9[200766]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796903.493855-2114-161593263994141/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:04 np0005531887 python3.9[200842]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:35:04 np0005531887 systemd[1]: Reloading.
Nov 22 02:35:04 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:04 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:05 np0005531887 podman[200878]: 2025-11-22 07:35:05.254357261 +0000 UTC m=+0.106386623 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:35:05 np0005531887 python3.9[200979]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:35:05 np0005531887 systemd[1]: Reloading.
Nov 22 02:35:05 np0005531887 podman[200981]: 2025-11-22 07:35:05.831626952 +0000 UTC m=+0.065332041 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 22 02:35:05 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:05 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:06 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:06 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Failed with result 'exit-code'.
Nov 22 02:35:06 np0005531887 systemd[1]: Starting podman_exporter container...
Nov 22 02:35:06 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:35:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50085a9fbfad0681cc30d2ae2434c0b6cea528dbddc267d4ae64d77660bd3819/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50085a9fbfad0681cc30d2ae2434c0b6cea528dbddc267d4ae64d77660bd3819/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:06 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.
Nov 22 02:35:06 np0005531887 podman[201037]: 2025-11-22 07:35:06.76862989 +0000 UTC m=+0.538514373 container init 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.790Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.790Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.790Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.790Z caller=handler.go:105 level=info collector=container
Nov 22 02:35:06 np0005531887 podman[201037]: 2025-11-22 07:35:06.805790029 +0000 UTC m=+0.575674492 container start 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:35:06 np0005531887 systemd[1]: Starting Podman API Service...
Nov 22 02:35:06 np0005531887 systemd[1]: Started Podman API Service.
Nov 22 02:35:06 np0005531887 podman[201037]: podman_exporter
Nov 22 02:35:06 np0005531887 systemd[1]: Started podman_exporter container.
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="Setting parallel job count to 25"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="Using sqlite as database backend"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 22 02:35:06 np0005531887 podman[201064]: @ - - [22/Nov/2025:07:35:06 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 02:35:06 np0005531887 podman[201064]: time="2025-11-22T07:35:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 02:35:06 np0005531887 podman[201064]: @ - - [22/Nov/2025:07:35:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.877Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.878Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 02:35:06 np0005531887 podman_exporter[201053]: ts=2025-11-22T07:35:06.878Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 02:35:06 np0005531887 podman[201062]: 2025-11-22 07:35:06.902433367 +0000 UTC m=+0.084949175 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:06 np0005531887 systemd[1]: 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2-545643807065ee5f.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:06 np0005531887 systemd[1]: 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2-545643807065ee5f.service: Failed with result 'exit-code'.
Nov 22 02:35:08 np0005531887 python3.9[201249]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:35:08 np0005531887 systemd[1]: Stopping podman_exporter container...
Nov 22 02:35:09 np0005531887 podman[201064]: @ - - [22/Nov/2025:07:35:06 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 22 02:35:09 np0005531887 systemd[1]: libpod-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.scope: Deactivated successfully.
Nov 22 02:35:09 np0005531887 podman[201253]: 2025-11-22 07:35:09.035603944 +0000 UTC m=+0.261985667 container died 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:35:09 np0005531887 systemd[1]: 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2-545643807065ee5f.timer: Deactivated successfully.
Nov 22 02:35:09 np0005531887 systemd[1]: Stopped /usr/bin/podman healthcheck run 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.
Nov 22 02:35:09 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2-userdata-shm.mount: Deactivated successfully.
Nov 22 02:35:09 np0005531887 systemd[1]: var-lib-containers-storage-overlay-50085a9fbfad0681cc30d2ae2434c0b6cea528dbddc267d4ae64d77660bd3819-merged.mount: Deactivated successfully.
Nov 22 02:35:09 np0005531887 podman[201253]: 2025-11-22 07:35:09.889299739 +0000 UTC m=+1.115681462 container cleanup 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:35:09 np0005531887 podman[201253]: podman_exporter
Nov 22 02:35:09 np0005531887 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:35:09 np0005531887 podman[201282]: podman_exporter
Nov 22 02:35:09 np0005531887 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 22 02:35:09 np0005531887 systemd[1]: Stopped podman_exporter container.
Nov 22 02:35:09 np0005531887 systemd[1]: Starting podman_exporter container...
Nov 22 02:35:10 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:35:10 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50085a9fbfad0681cc30d2ae2434c0b6cea528dbddc267d4ae64d77660bd3819/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:10 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50085a9fbfad0681cc30d2ae2434c0b6cea528dbddc267d4ae64d77660bd3819/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:10 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.
Nov 22 02:35:10 np0005531887 podman[201294]: 2025-11-22 07:35:10.454471435 +0000 UTC m=+0.481709609 container init 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.469Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.469Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.469Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.469Z caller=handler.go:105 level=info collector=container
Nov 22 02:35:10 np0005531887 podman[201064]: @ - - [22/Nov/2025:07:35:10 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 02:35:10 np0005531887 podman[201064]: time="2025-11-22T07:35:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 02:35:10 np0005531887 podman[201294]: 2025-11-22 07:35:10.483933328 +0000 UTC m=+0.511171472 container start 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:10 np0005531887 podman[201294]: podman_exporter
Nov 22 02:35:10 np0005531887 podman[201064]: @ - - [22/Nov/2025:07:35:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19570 "" "Go-http-client/1.1"
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.723Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.724Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 02:35:10 np0005531887 podman_exporter[201309]: ts=2025-11-22T07:35:10.724Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 02:35:10 np0005531887 systemd[1]: Started podman_exporter container.
Nov 22 02:35:10 np0005531887 podman[201319]: 2025-11-22 07:35:10.780089451 +0000 UTC m=+0.285946346 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:35:11 np0005531887 python3.9[201495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:35:12 np0005531887 python3.9[201618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796910.9379673-2210-35659512142345/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 02:35:12 np0005531887 python3.9[201770]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 22 02:35:14 np0005531887 python3.9[201922]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 02:35:14 np0005531887 podman[202046]: 2025-11-22 07:35:14.830165373 +0000 UTC m=+0.060294779 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 02:35:15 np0005531887 python3[202092]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 02:35:17 np0005531887 podman[202161]: 2025-11-22 07:35:17.435789525 +0000 UTC m=+0.145649633 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:35:17 np0005531887 podman[202104]: 2025-11-22 07:35:17.596639495 +0000 UTC m=+2.389460325 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:17 np0005531887 podman[202215]: 2025-11-22 07:35:17.748985779 +0000 UTC m=+0.052432868 container create ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 02:35:17 np0005531887 podman[202215]: 2025-11-22 07:35:17.719470766 +0000 UTC m=+0.022917885 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:17 np0005531887 python3[202092]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 02:35:18 np0005531887 python3.9[202405]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:35:19 np0005531887 python3.9[202559]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:20 np0005531887 python3.9[202710]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796919.6789238-2369-5703781930468/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:35:20 np0005531887 python3.9[202786]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 02:35:20 np0005531887 systemd[1]: Reloading.
Nov 22 02:35:21 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:21 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:21 np0005531887 podman[202898]: 2025-11-22 07:35:21.842549655 +0000 UTC m=+0.058889285 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:35:21 np0005531887 python3.9[202897]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 02:35:21 np0005531887 systemd[1]: Reloading.
Nov 22 02:35:22 np0005531887 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 02:35:22 np0005531887 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 02:35:22 np0005531887 systemd[1]: Starting openstack_network_exporter container...
Nov 22 02:35:22 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:35:22 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:22 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.
Nov 22 02:35:22 np0005531887 podman[202961]: 2025-11-22 07:35:22.423531785 +0000 UTC m=+0.139181478 container init ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *bridge.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *coverage.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *datapath.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *iface.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *memory.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *ovnnorthd.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *ovn.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *ovsdbserver.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *pmd_perf.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *pmd_rxq.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: INFO    07:35:22 main.go:48: registering *vswitch.Collector
Nov 22 02:35:22 np0005531887 openstack_network_exporter[202977]: NOTICE  07:35:22 main.go:76: listening on https://:9105/metrics
Nov 22 02:35:22 np0005531887 podman[202961]: 2025-11-22 07:35:22.452972237 +0000 UTC m=+0.168621960 container start ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:35:22 np0005531887 podman[202961]: openstack_network_exporter
Nov 22 02:35:22 np0005531887 systemd[1]: Started openstack_network_exporter container.
Nov 22 02:35:22 np0005531887 podman[202987]: 2025-11-22 07:35:22.566069432 +0000 UTC m=+0.103213767 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 22 02:35:23 np0005531887 python3.9[203163]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 02:35:23 np0005531887 systemd[1]: Stopping openstack_network_exporter container...
Nov 22 02:35:23 np0005531887 systemd[1]: libpod-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.scope: Deactivated successfully.
Nov 22 02:35:23 np0005531887 podman[203168]: 2025-11-22 07:35:23.669368093 +0000 UTC m=+0.065948667 container died ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:35:24 np0005531887 systemd[1]: ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e-5ff38d259da71f82.timer: Deactivated successfully.
Nov 22 02:35:24 np0005531887 systemd[1]: Stopped /usr/bin/podman healthcheck run ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.
Nov 22 02:35:24 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e-userdata-shm.mount: Deactivated successfully.
Nov 22 02:35:24 np0005531887 systemd[1]: var-lib-containers-storage-overlay-77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9-merged.mount: Deactivated successfully.
Nov 22 02:35:26 np0005531887 podman[203168]: 2025-11-22 07:35:26.769808731 +0000 UTC m=+3.166389295 container cleanup ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 22 02:35:26 np0005531887 podman[203168]: openstack_network_exporter
Nov 22 02:35:26 np0005531887 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 02:35:26 np0005531887 podman[203197]: openstack_network_exporter
Nov 22 02:35:26 np0005531887 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 22 02:35:26 np0005531887 systemd[1]: Stopped openstack_network_exporter container.
Nov 22 02:35:26 np0005531887 systemd[1]: Starting openstack_network_exporter container...
Nov 22 02:35:26 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:35:27 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77db9091a6c51f1f0a177627a2a6f1b81b649920585dd55b9f2e90b6a527ffc9/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.031 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.032 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 systemd[1]: Started /usr/bin/podman healthcheck run ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.
Nov 22 02:35:27 np0005531887 podman[203210]: 2025-11-22 07:35:27.078065145 +0000 UTC m=+0.207747884 container init ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *bridge.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *coverage.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *datapath.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *iface.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *memory.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *ovnnorthd.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *ovn.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *ovsdbserver.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *pmd_perf.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *pmd_rxq.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: INFO    07:35:27 main.go:48: registering *vswitch.Collector
Nov 22 02:35:27 np0005531887 openstack_network_exporter[203225]: NOTICE  07:35:27 main.go:76: listening on https://:9105/metrics
Nov 22 02:35:27 np0005531887 podman[203210]: 2025-11-22 07:35:27.108765048 +0000 UTC m=+0.238447767 container start ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 02:35:27 np0005531887 podman[203210]: openstack_network_exporter
Nov 22 02:35:27 np0005531887 systemd[1]: Started openstack_network_exporter container.
Nov 22 02:35:27 np0005531887 podman[203235]: 2025-11-22 07:35:27.235092602 +0000 UTC m=+0.115398351 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:27 np0005531887 nova_compute[186849]: 2025-11-22 07:35:27.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:35:27 np0005531887 python3.9[203407]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.789 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.994 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.995 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5912MB free_disk=73.49393844604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.995 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:28 np0005531887 nova_compute[186849]: 2025-11-22 07:35:28.995 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.056 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.057 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.077 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.090 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.092 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:35:29 np0005531887 nova_compute[186849]: 2025-11-22 07:35:29.092 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:35 np0005531887 podman[203432]: 2025-11-22 07:35:35.865414119 +0000 UTC m=+0.085796665 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:35:36 np0005531887 podman[203459]: 2025-11-22 07:35:36.838138953 +0000 UTC m=+0.055440793 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:35:36 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:35:36 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Failed with result 'exit-code'.
Nov 22 02:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:35:37.306 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:35:37.307 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:35:37.307 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:35:41 np0005531887 podman[203479]: 2025-11-22 07:35:41.835827222 +0000 UTC m=+0.059258063 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:35:45 np0005531887 podman[203504]: 2025-11-22 07:35:45.823900677 +0000 UTC m=+0.044685773 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:35:47 np0005531887 podman[203523]: 2025-11-22 07:35:47.841128009 +0000 UTC m=+0.063932957 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:35:52 np0005531887 podman[203545]: 2025-11-22 07:35:52.834341349 +0000 UTC m=+0.053475666 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:35:56 np0005531887 python3.9[203696]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 22 02:35:57 np0005531887 python3.9[203861]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:35:57 np0005531887 systemd[1]: Started libpod-conmon-5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d.scope.
Nov 22 02:35:57 np0005531887 podman[203862]: 2025-11-22 07:35:57.217839034 +0000 UTC m=+0.103053791 container exec 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:35:57 np0005531887 podman[203862]: 2025-11-22 07:35:57.251613936 +0000 UTC m=+0.136828673 container exec_died 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:35:57 np0005531887 systemd[1]: libpod-conmon-5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d.scope: Deactivated successfully.
Nov 22 02:35:57 np0005531887 podman[203893]: 2025-11-22 07:35:57.380007724 +0000 UTC m=+0.062653850 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:35:57 np0005531887 python3.9[204065]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:35:58 np0005531887 systemd[1]: Started libpod-conmon-5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d.scope.
Nov 22 02:35:58 np0005531887 podman[204066]: 2025-11-22 07:35:58.084524361 +0000 UTC m=+0.080588898 container exec 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:35:58 np0005531887 podman[204066]: 2025-11-22 07:35:58.116986009 +0000 UTC m=+0.113050526 container exec_died 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:35:58 np0005531887 systemd[1]: libpod-conmon-5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d.scope: Deactivated successfully.
Nov 22 02:36:00 np0005531887 python3.9[204249]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:01 np0005531887 python3.9[204401]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 22 02:36:02 np0005531887 python3.9[204566]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:02 np0005531887 systemd[1]: Started libpod-conmon-cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da.scope.
Nov 22 02:36:02 np0005531887 podman[204567]: 2025-11-22 07:36:02.468030573 +0000 UTC m=+0.367912524 container exec cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:36:02 np0005531887 podman[204586]: 2025-11-22 07:36:02.665610017 +0000 UTC m=+0.183003203 container exec_died cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:36:02 np0005531887 podman[204567]: 2025-11-22 07:36:02.824655539 +0000 UTC m=+0.724537470 container exec_died cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:36:02 np0005531887 systemd[1]: libpod-conmon-cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da.scope: Deactivated successfully.
Nov 22 02:36:03 np0005531887 python3.9[204750]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:03 np0005531887 systemd[1]: Started libpod-conmon-cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da.scope.
Nov 22 02:36:03 np0005531887 podman[204751]: 2025-11-22 07:36:03.886834605 +0000 UTC m=+0.064813125 container exec cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:36:03 np0005531887 podman[204751]: 2025-11-22 07:36:03.922600719 +0000 UTC m=+0.100579209 container exec_died cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:36:03 np0005531887 systemd[1]: libpod-conmon-cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da.scope: Deactivated successfully.
Nov 22 02:36:04 np0005531887 python3.9[204935]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:05 np0005531887 python3.9[205087]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 22 02:36:06 np0005531887 podman[205254]: 2025-11-22 07:36:06.038064457 +0000 UTC m=+0.098870436 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:36:06 np0005531887 python3.9[205255]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:06 np0005531887 systemd[1]: Started libpod-conmon-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope.
Nov 22 02:36:06 np0005531887 podman[205281]: 2025-11-22 07:36:06.243756568 +0000 UTC m=+0.073402775 container exec 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:36:06 np0005531887 podman[205281]: 2025-11-22 07:36:06.27749506 +0000 UTC m=+0.107141287 container exec_died 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 02:36:06 np0005531887 systemd[1]: libpod-conmon-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope: Deactivated successfully.
Nov 22 02:36:06 np0005531887 podman[205463]: 2025-11-22 07:36:06.96732255 +0000 UTC m=+0.061643135 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:36:06 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 02:36:06 np0005531887 systemd[1]: 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd-108737761a9ec916.service: Failed with result 'exit-code'.
Nov 22 02:36:07 np0005531887 python3.9[205464]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:07 np0005531887 systemd[1]: Started libpod-conmon-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope.
Nov 22 02:36:07 np0005531887 podman[205483]: 2025-11-22 07:36:07.230612893 +0000 UTC m=+0.087462394 container exec 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:36:07 np0005531887 podman[205483]: 2025-11-22 07:36:07.263640886 +0000 UTC m=+0.120490397 container exec_died 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:36:07 np0005531887 systemd[1]: libpod-conmon-30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3.scope: Deactivated successfully.
Nov 22 02:36:08 np0005531887 python3.9[205666]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:08 np0005531887 python3.9[205818]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 22 02:36:09 np0005531887 python3.9[205983]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:09 np0005531887 systemd[1]: Started libpod-conmon-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope.
Nov 22 02:36:09 np0005531887 podman[205984]: 2025-11-22 07:36:09.757554966 +0000 UTC m=+0.106744946 container exec 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:36:09 np0005531887 podman[206004]: 2025-11-22 07:36:09.831490484 +0000 UTC m=+0.057941790 container exec_died 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:36:09 np0005531887 podman[205984]: 2025-11-22 07:36:09.837479647 +0000 UTC m=+0.186669607 container exec_died 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:36:09 np0005531887 systemd[1]: libpod-conmon-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope: Deactivated successfully.
Nov 22 02:36:10 np0005531887 python3.9[206168]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:10 np0005531887 systemd[1]: Started libpod-conmon-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope.
Nov 22 02:36:10 np0005531887 podman[206169]: 2025-11-22 07:36:10.680545331 +0000 UTC m=+0.073410705 container exec 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:36:10 np0005531887 podman[206169]: 2025-11-22 07:36:10.712576158 +0000 UTC m=+0.105441532 container exec_died 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:36:10 np0005531887 systemd[1]: libpod-conmon-083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd.scope: Deactivated successfully.
Nov 22 02:36:11 np0005531887 python3.9[206350]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:12 np0005531887 podman[206474]: 2025-11-22 07:36:12.110733324 +0000 UTC m=+0.055933759 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:36:12 np0005531887 python3.9[206526]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 22 02:36:13 np0005531887 python3.9[206692]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:13 np0005531887 systemd[1]: Started libpod-conmon-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope.
Nov 22 02:36:13 np0005531887 podman[206693]: 2025-11-22 07:36:13.379579648 +0000 UTC m=+0.129195489 container exec 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:13 np0005531887 podman[206693]: 2025-11-22 07:36:13.413646539 +0000 UTC m=+0.163262360 container exec_died 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:13 np0005531887 systemd[1]: libpod-conmon-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope: Deactivated successfully.
Nov 22 02:36:14 np0005531887 python3.9[206876]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:14 np0005531887 systemd[1]: Started libpod-conmon-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope.
Nov 22 02:36:14 np0005531887 podman[206877]: 2025-11-22 07:36:14.233101569 +0000 UTC m=+0.067214827 container exec 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:36:14 np0005531887 podman[206896]: 2025-11-22 07:36:14.299464393 +0000 UTC m=+0.053497156 container exec_died 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:36:14 np0005531887 podman[206877]: 2025-11-22 07:36:14.304879312 +0000 UTC m=+0.138992560 container exec_died 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:14 np0005531887 systemd[1]: libpod-conmon-3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f.scope: Deactivated successfully.
Nov 22 02:36:14 np0005531887 python3.9[207061]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:16 np0005531887 podman[207185]: 2025-11-22 07:36:16.12145957 +0000 UTC m=+0.059549212 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:36:16 np0005531887 python3.9[207231]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 22 02:36:17 np0005531887 python3.9[207396]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:17 np0005531887 systemd[1]: Started libpod-conmon-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.scope.
Nov 22 02:36:17 np0005531887 podman[207397]: 2025-11-22 07:36:17.698989334 +0000 UTC m=+0.075548829 container exec 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:36:17 np0005531887 podman[207397]: 2025-11-22 07:36:17.729661828 +0000 UTC m=+0.106221293 container exec_died 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:36:17 np0005531887 systemd[1]: libpod-conmon-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.scope: Deactivated successfully.
Nov 22 02:36:18 np0005531887 podman[207552]: 2025-11-22 07:36:18.241485905 +0000 UTC m=+0.060670300 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:36:18 np0005531887 python3.9[207599]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:18 np0005531887 systemd[1]: Started libpod-conmon-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.scope.
Nov 22 02:36:18 np0005531887 podman[207600]: 2025-11-22 07:36:18.523472884 +0000 UTC m=+0.073780495 container exec 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:36:18 np0005531887 podman[207600]: 2025-11-22 07:36:18.559650598 +0000 UTC m=+0.109958189 container exec_died 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:36:18 np0005531887 systemd[1]: libpod-conmon-899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2.scope: Deactivated successfully.
Nov 22 02:36:19 np0005531887 python3.9[207785]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:19 np0005531887 python3.9[207937]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 22 02:36:20 np0005531887 python3.9[208102]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:20 np0005531887 systemd[1]: Started libpod-conmon-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.scope.
Nov 22 02:36:20 np0005531887 podman[208103]: 2025-11-22 07:36:20.857144514 +0000 UTC m=+0.067076624 container exec ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 22 02:36:20 np0005531887 podman[208103]: 2025-11-22 07:36:20.891542691 +0000 UTC m=+0.101474771 container exec_died ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 22 02:36:20 np0005531887 systemd[1]: libpod-conmon-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.scope: Deactivated successfully.
Nov 22 02:36:21 np0005531887 python3.9[208287]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 02:36:21 np0005531887 systemd[1]: Started libpod-conmon-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.scope.
Nov 22 02:36:21 np0005531887 podman[208288]: 2025-11-22 07:36:21.696043471 +0000 UTC m=+0.078140266 container exec ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 22 02:36:21 np0005531887 podman[208288]: 2025-11-22 07:36:21.730775368 +0000 UTC m=+0.112872163 container exec_died ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, config_id=edpm, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 22 02:36:21 np0005531887 systemd[1]: libpod-conmon-ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e.scope: Deactivated successfully.
Nov 22 02:36:22 np0005531887 python3.9[208471]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:23 np0005531887 podman[208496]: 2025-11-22 07:36:23.835116483 +0000 UTC m=+0.054263155 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:27 np0005531887 podman[208521]: 2025-11-22 07:36:27.839188007 +0000 UTC m=+0.063169594 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.092 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.093 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.781 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:28 np0005531887 nova_compute[186849]: 2025-11-22 07:36:28.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:36:29 np0005531887 nova_compute[186849]: 2025-11-22 07:36:29.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.956 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.957 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5971MB free_disk=73.49348831176758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.957 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:30 np0005531887 nova_compute[186849]: 2025-11-22 07:36:30.957 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.050 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.050 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.087 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.103 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.105 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:36:31 np0005531887 nova_compute[186849]: 2025-11-22 07:36:31.105 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.655 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:36:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:36:36 np0005531887 podman[208542]: 2025-11-22 07:36:36.886206131 +0000 UTC m=+0.108867751 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:36:37.307 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:36:37.308 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:36:37.308 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:36:37 np0005531887 podman[208568]: 2025-11-22 07:36:37.838653167 +0000 UTC m=+0.062778774 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:36:42 np0005531887 podman[208589]: 2025-11-22 07:36:42.83241499 +0000 UTC m=+0.052503842 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:36:46 np0005531887 podman[208613]: 2025-11-22 07:36:46.838352621 +0000 UTC m=+0.062007632 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:36:48 np0005531887 podman[208634]: 2025-11-22 07:36:48.827011882 +0000 UTC m=+0.048363635 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:36:49 np0005531887 python3.9[208781]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:50 np0005531887 python3.9[208933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:50 np0005531887 python3.9[209056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763797009.744877-3212-216345631647357/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:51 np0005531887 python3.9[209208]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:52 np0005531887 python3.9[209360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:52 np0005531887 python3.9[209438]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:53 np0005531887 python3.9[209590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:54 np0005531887 podman[209640]: 2025-11-22 07:36:54.019206762 +0000 UTC m=+0.053252820 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:36:54 np0005531887 python3.9[209692]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rnicvz6j recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:54 np0005531887 python3.9[209844]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:55 np0005531887 python3.9[209922]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:56 np0005531887 python3.9[210074]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:36:57 np0005531887 python3[210227]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 02:36:57 np0005531887 python3.9[210379]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:58 np0005531887 podman[210429]: 2025-11-22 07:36:58.137621946 +0000 UTC m=+0.091912558 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 02:36:58 np0005531887 python3.9[210474]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:36:59 np0005531887 python3.9[210630]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:36:59 np0005531887 python3.9[210708]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:00 np0005531887 python3.9[210860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:00 np0005531887 python3.9[210938]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:01 np0005531887 python3.9[211090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:02 np0005531887 python3.9[211168]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:02 np0005531887 python3.9[211320]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 02:37:03 np0005531887 python3.9[211445]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763797022.2805269-3587-176382096715593/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:04 np0005531887 python3.9[211597]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:04 np0005531887 python3.9[211749]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:05 np0005531887 python3.9[211904]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:06 np0005531887 python3.9[212056]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:07 np0005531887 podman[212181]: 2025-11-22 07:37:07.358333828 +0000 UTC m=+0.108992537 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:37:07 np0005531887 python3.9[212230]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 02:37:08 np0005531887 podman[212362]: 2025-11-22 07:37:08.044127413 +0000 UTC m=+0.059326985 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:37:08 np0005531887 python3.9[212410]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 02:37:08 np0005531887 python3.9[212566]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 02:37:09 np0005531887 systemd[1]: session-26.scope: Deactivated successfully.
Nov 22 02:37:09 np0005531887 systemd[1]: session-26.scope: Consumed 1min 40.931s CPU time.
Nov 22 02:37:09 np0005531887 systemd-logind[821]: Session 26 logged out. Waiting for processes to exit.
Nov 22 02:37:09 np0005531887 systemd-logind[821]: Removed session 26.
Nov 22 02:37:13 np0005531887 podman[212591]: 2025-11-22 07:37:13.844541881 +0000 UTC m=+0.068834572 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:37:17 np0005531887 podman[212616]: 2025-11-22 07:37:17.826152152 +0000 UTC m=+0.045284665 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 02:37:19 np0005531887 podman[212636]: 2025-11-22 07:37:19.822018889 +0000 UTC m=+0.046487804 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 22 02:37:24 np0005531887 podman[212656]: 2025-11-22 07:37:24.835546994 +0000 UTC m=+0.057454939 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.099 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:28 np0005531887 nova_compute[186849]: 2025-11-22 07:37:28.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:37:28 np0005531887 podman[212683]: 2025-11-22 07:37:28.832629362 +0000 UTC m=+0.054141105 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vcs-type=git, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:29 np0005531887 nova_compute[186849]: 2025-11-22 07:37:29.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.952 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.954 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6031MB free_disk=73.49720764160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.954 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:32 np0005531887 nova_compute[186849]: 2025-11-22 07:37:32.954 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.015 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.015 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.050 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.066 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.067 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:37:33 np0005531887 nova_compute[186849]: 2025-11-22 07:37:33.068 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:37.308 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:37.308 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:37.309 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:37:37 np0005531887 podman[212706]: 2025-11-22 07:37:37.857071479 +0000 UTC m=+0.076304920 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:37:38 np0005531887 podman[212732]: 2025-11-22 07:37:38.842638114 +0000 UTC m=+0.063959572 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:37:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:40.075 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:37:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:40.076 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:37:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:37:40.078 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:37:44 np0005531887 podman[212752]: 2025-11-22 07:37:44.842665615 +0000 UTC m=+0.061231582 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:37:48 np0005531887 podman[212777]: 2025-11-22 07:37:48.826001369 +0000 UTC m=+0.050233078 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:37:50 np0005531887 podman[212797]: 2025-11-22 07:37:50.848217316 +0000 UTC m=+0.063047509 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:37:55 np0005531887 podman[212817]: 2025-11-22 07:37:55.83818271 +0000 UTC m=+0.053564602 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:37:59 np0005531887 podman[212842]: 2025-11-22 07:37:59.829058732 +0000 UTC m=+0.050576456 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 02:38:08 np0005531887 podman[212864]: 2025-11-22 07:38:08.854387707 +0000 UTC m=+0.075037430 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 22 02:38:08 np0005531887 podman[212890]: 2025-11-22 07:38:08.935109037 +0000 UTC m=+0.054500724 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:38:15 np0005531887 podman[212912]: 2025-11-22 07:38:15.835463802 +0000 UTC m=+0.053651422 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:38:19 np0005531887 podman[212935]: 2025-11-22 07:38:19.830719916 +0000 UTC m=+0.055993780 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 02:38:21 np0005531887 podman[212956]: 2025-11-22 07:38:21.847222134 +0000 UTC m=+0.068584001 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:38:26 np0005531887 nova_compute[186849]: 2025-11-22 07:38:26.796 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:26 np0005531887 podman[212976]: 2025-11-22 07:38:26.852168096 +0000 UTC m=+0.073355038 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:38:28 np0005531887 nova_compute[186849]: 2025-11-22 07:38:28.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531887 nova_compute[186849]: 2025-11-22 07:38:28.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531887 nova_compute[186849]: 2025-11-22 07:38:28.808 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:28 np0005531887 nova_compute[186849]: 2025-11-22 07:38:28.808 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:38:29 np0005531887 nova_compute[186849]: 2025-11-22 07:38:29.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:30 np0005531887 nova_compute[186849]: 2025-11-22 07:38:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:30 np0005531887 nova_compute[186849]: 2025-11-22 07:38:30.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:38:30 np0005531887 nova_compute[186849]: 2025-11-22 07:38:30.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:38:30 np0005531887 nova_compute[186849]: 2025-11-22 07:38:30.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:38:30 np0005531887 nova_compute[186849]: 2025-11-22 07:38:30.781 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:30 np0005531887 podman[213000]: 2025-11-22 07:38:30.832835351 +0000 UTC m=+0.057283253 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 22 02:38:31 np0005531887 nova_compute[186849]: 2025-11-22 07:38:31.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:31 np0005531887 nova_compute[186849]: 2025-11-22 07:38:31.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.934 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.935 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6058MB free_disk=73.49720764160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.936 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:33 np0005531887 nova_compute[186849]: 2025-11-22 07:38:33.936 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.066 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.066 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.128 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.174 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.174 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.195 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.236 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.261 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.275 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.276 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:38:34 np0005531887 nova_compute[186849]: 2025-11-22 07:38:34.277 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:38:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:38:37.310 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:38:37.310 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:38:37.311 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:38:39 np0005531887 podman[213021]: 2025-11-22 07:38:39.838190895 +0000 UTC m=+0.055014556 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible)
Nov 22 02:38:39 np0005531887 podman[213022]: 2025-11-22 07:38:39.871337592 +0000 UTC m=+0.079551481 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:38:46 np0005531887 podman[213067]: 2025-11-22 07:38:46.871198941 +0000 UTC m=+0.087646461 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:38:50 np0005531887 podman[213093]: 2025-11-22 07:38:50.837408959 +0000 UTC m=+0.055489999 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:38:52 np0005531887 podman[213112]: 2025-11-22 07:38:52.846098232 +0000 UTC m=+0.062256105 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 02:38:57 np0005531887 podman[213134]: 2025-11-22 07:38:57.832857737 +0000 UTC m=+0.056312508 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:39:01 np0005531887 podman[213158]: 2025-11-22 07:39:01.83215234 +0000 UTC m=+0.053519610 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 02:39:10 np0005531887 podman[213179]: 2025-11-22 07:39:10.834783691 +0000 UTC m=+0.057873092 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:39:10 np0005531887 podman[213180]: 2025-11-22 07:39:10.860026587 +0000 UTC m=+0.080927243 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 02:39:17 np0005531887 podman[213223]: 2025-11-22 07:39:17.839794372 +0000 UTC m=+0.053746446 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:39:21 np0005531887 podman[213248]: 2025-11-22 07:39:21.829221674 +0000 UTC m=+0.050650677 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:39:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:22.513 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:39:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:22.514 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:39:23 np0005531887 podman[213266]: 2025-11-22 07:39:23.828223836 +0000 UTC m=+0.052752080 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:39:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:28.516 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:39:28 np0005531887 podman[213287]: 2025-11-22 07:39:28.829155017 +0000 UTC m=+0.049473577 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:39:29 np0005531887 nova_compute[186849]: 2025-11-22 07:39:29.276 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:29 np0005531887 nova_compute[186849]: 2025-11-22 07:39:29.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:29 np0005531887 nova_compute[186849]: 2025-11-22 07:39:29.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.813 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.814 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:30 np0005531887 nova_compute[186849]: 2025-11-22 07:39:30.814 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:39:31 np0005531887 nova_compute[186849]: 2025-11-22 07:39:31.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:31 np0005531887 nova_compute[186849]: 2025-11-22 07:39:31.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:32 np0005531887 nova_compute[186849]: 2025-11-22 07:39:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:32 np0005531887 nova_compute[186849]: 2025-11-22 07:39:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:32 np0005531887 podman[213311]: 2025-11-22 07:39:32.864341432 +0000 UTC m=+0.083728994 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.795 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.960 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.961 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6075MB free_disk=73.49718856811523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.962 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:33 np0005531887 nova_compute[186849]: 2025-11-22 07:39:33.962 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.043 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.043 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.082 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.101 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.103 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:39:34 np0005531887 nova_compute[186849]: 2025-11-22 07:39:34.103 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:37.311 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:37.312 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:39:37.312 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.736 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.737 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.764 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.943 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.943 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.949 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:39:40 np0005531887 nova_compute[186849]: 2025-11-22 07:39:40.949 186853 INFO nova.compute.claims [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.064 186853 DEBUG nova.compute.provider_tree [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.080 186853 DEBUG nova.scheduler.client.report [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.113 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.114 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.163 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.163 186853 DEBUG nova.network.neutron [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.181 186853 INFO nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.200 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.311 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.312 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.313 186853 INFO nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Creating image(s)#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.313 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.313 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.314 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.315 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:41 np0005531887 nova_compute[186849]: 2025-11-22 07:39:41.315 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:41 np0005531887 podman[213337]: 2025-11-22 07:39:41.833456231 +0000 UTC m=+0.053446579 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:39:41 np0005531887 podman[213338]: 2025-11-22 07:39:41.867117012 +0000 UTC m=+0.083257582 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 02:39:42 np0005531887 nova_compute[186849]: 2025-11-22 07:39:42.305 186853 DEBUG nova.network.neutron [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:39:42 np0005531887 nova_compute[186849]: 2025-11-22 07:39:42.305 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:39:43 np0005531887 nova_compute[186849]: 2025-11-22 07:39:43.723 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:43 np0005531887 nova_compute[186849]: 2025-11-22 07:39:43.773 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:43 np0005531887 nova_compute[186849]: 2025-11-22 07:39:43.775 186853 DEBUG nova.virt.images [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] eb6eb4ac-7956-4021-b3a0-d612ae61d38c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:39:43 np0005531887 nova_compute[186849]: 2025-11-22 07:39:43.782 186853 DEBUG nova.privsep.utils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:39:43 np0005531887 nova_compute[186849]: 2025-11-22 07:39:43.782 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:44 np0005531887 nova_compute[186849]: 2025-11-22 07:39:44.684 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted" returned: 0 in 0.901s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:44 np0005531887 nova_compute[186849]: 2025-11-22 07:39:44.690 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:44 np0005531887 nova_compute[186849]: 2025-11-22 07:39:44.755 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:44 np0005531887 nova_compute[186849]: 2025-11-22 07:39:44.756 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:44 np0005531887 nova_compute[186849]: 2025-11-22 07:39:44.767 186853 INFO oslo.privsep.daemon [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp5qjiwc1z/privsep.sock']#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.452 186853 INFO oslo.privsep.daemon [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.320 213402 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.324 213402 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.326 213402 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.327 213402 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213402#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.535 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.594 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.595 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.595 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.606 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.658 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.659 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.702 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.703 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.704 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.755 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.756 186853 DEBUG nova.virt.disk.api [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Checking if we can resize image /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.756 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.822 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.824 186853 DEBUG nova.virt.disk.api [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Cannot resize image /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:39:45 np0005531887 nova_compute[186849]: 2025-11-22 07:39:45.824 186853 DEBUG nova.objects.instance [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 5fde891f-fc12-4c02-8b8d-0475ce19a753 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.184 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.185 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Ensure instance console log exists: /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.185 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.186 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.186 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.188 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.193 186853 WARNING nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.198 186853 DEBUG nova.virt.libvirt.host [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.198 186853 DEBUG nova.virt.libvirt.host [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.202 186853 DEBUG nova.virt.libvirt.host [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.202 186853 DEBUG nova.virt.libvirt.host [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.204 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.205 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.205 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.205 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.206 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.206 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.206 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.206 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.207 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.207 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.207 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.207 186853 DEBUG nova.virt.hardware [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.211 186853 DEBUG nova.privsep.utils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.213 186853 DEBUG nova.objects.instance [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fde891f-fc12-4c02-8b8d-0475ce19a753 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.593 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <uuid>5fde891f-fc12-4c02-8b8d-0475ce19a753</uuid>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <name>instance-00000003</name>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-425818617</nova:name>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:39:46</nova:creationTime>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:user uuid="ae0a9bb236424581bf35f94644a5484c">tempest-DeleteServersAdminTestJSON-2048590971-project-member</nova:user>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:        <nova:project uuid="dd45d638bd73499da80359efc81898a3">tempest-DeleteServersAdminTestJSON-2048590971</nova:project>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="serial">5fde891f-fc12-4c02-8b8d-0475ce19a753</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="uuid">5fde891f-fc12-4c02-8b8d-0475ce19a753</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.config"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/console.log" append="off"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:39:46 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:39:46 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:39:46 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:39:46 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.651 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.651 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:46 np0005531887 nova_compute[186849]: 2025-11-22 07:39:46.652 186853 INFO nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Using config drive#033[00m
Nov 22 02:39:47 np0005531887 nova_compute[186849]: 2025-11-22 07:39:47.741 186853 INFO nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Creating config drive at /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.config#033[00m
Nov 22 02:39:47 np0005531887 nova_compute[186849]: 2025-11-22 07:39:47.746 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9bp2xg6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:47 np0005531887 nova_compute[186849]: 2025-11-22 07:39:47.871 186853 DEBUG oslo_concurrency.processutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9bp2xg6" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:47 np0005531887 systemd-machined[153180]: New machine qemu-1-instance-00000003.
Nov 22 02:39:47 np0005531887 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Nov 22 02:39:48 np0005531887 podman[213429]: 2025-11-22 07:39:48.016097404 +0000 UTC m=+0.060490179 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.525 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797188.5163581, 5fde891f-fc12-4c02-8b8d-0475ce19a753 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.526 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.529 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.530 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.535 186853 INFO nova.virt.libvirt.driver [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance spawned successfully.#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.536 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.556 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.562 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.566 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.566 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.567 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.568 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.568 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.569 186853 DEBUG nova.virt.libvirt.driver [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.593 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.593 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797188.5215786, 5fde891f-fc12-4c02-8b8d-0475ce19a753 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.594 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] VM Started (Lifecycle Event)#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.623 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.627 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.655 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.675 186853 INFO nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Took 7.36 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.676 186853 DEBUG nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.742 186853 INFO nova.compute.manager [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Took 7.91 seconds to build instance.#033[00m
Nov 22 02:39:48 np0005531887 nova_compute[186849]: 2025-11-22 07:39:48.758 186853 DEBUG oslo_concurrency.lockutils [None req-b94b9b74-a7fc-4fbf-9a68-71367e4dd35e ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.216 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.217 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.217 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "5fde891f-fc12-4c02-8b8d-0475ce19a753-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.217 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.218 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.227 186853 INFO nova.compute.manager [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Terminating instance#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.235 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "refresh_cache-5fde891f-fc12-4c02-8b8d-0475ce19a753" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.236 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquired lock "refresh_cache-5fde891f-fc12-4c02-8b8d-0475ce19a753" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.236 186853 DEBUG nova.network.neutron [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.569 186853 DEBUG nova.network.neutron [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.947 186853 DEBUG nova.network.neutron [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.969 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Releasing lock "refresh_cache-5fde891f-fc12-4c02-8b8d-0475ce19a753" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:39:51 np0005531887 nova_compute[186849]: 2025-11-22 07:39:51.969 186853 DEBUG nova.compute.manager [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:39:52 np0005531887 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 22 02:39:52 np0005531887 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 3.850s CPU time.
Nov 22 02:39:52 np0005531887 systemd-machined[153180]: Machine qemu-1-instance-00000003 terminated.
Nov 22 02:39:52 np0005531887 podman[213469]: 2025-11-22 07:39:52.068488439 +0000 UTC m=+0.058687663 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.220 186853 INFO nova.virt.libvirt.driver [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance destroyed successfully.#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.220 186853 DEBUG nova.objects.instance [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lazy-loading 'resources' on Instance uuid 5fde891f-fc12-4c02-8b8d-0475ce19a753 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.231 186853 INFO nova.virt.libvirt.driver [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Deleting instance files /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753_del#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.232 186853 INFO nova.virt.libvirt.driver [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Deletion of /var/lib/nova/instances/5fde891f-fc12-4c02-8b8d-0475ce19a753_del complete#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.383 186853 DEBUG nova.virt.libvirt.host [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.384 186853 INFO nova.virt.libvirt.host [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] UEFI support detected#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.385 186853 INFO nova.compute.manager [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.386 186853 DEBUG oslo.service.loopingcall [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.386 186853 DEBUG nova.compute.manager [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.386 186853 DEBUG nova.network.neutron [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.626 186853 DEBUG nova.network.neutron [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.646 186853 DEBUG nova.network.neutron [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.665 186853 INFO nova.compute.manager [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Took 0.28 seconds to deallocate network for instance.#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.768 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.768 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.855 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.891 186853 ERROR nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [req-2143912d-75f0-4afd-8498-e1b6d638050e] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-2143912d-75f0-4afd-8498-e1b6d638050e"}]}#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.915 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.939 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.939 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:52 np0005531887 nova_compute[186849]: 2025-11-22 07:39:52.959 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.002 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.056 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.087 186853 ERROR nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] [req-cda27d99-a962-4c15-9433-f03bab1cc8c8] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-cda27d99-a962-4c15-9433-f03bab1cc8c8"}]}#033[00m
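Editor's note: the repeated 409 `placement.concurrent_update` errors above are Placement's generation-based optimistic concurrency at work. Every inventory PUT must carry the resource-provider generation the client last saw; a stale generation is rejected with 409, after which the report client refreshes inventories, aggregates, and traits and retries (the log shows success on the third attempt, bumping generation 4 to 5). A minimal sketch of the mechanism, assuming a hypothetical in-memory store (`ProviderStore`, `put_inventory`, and `update_with_retry` are illustrative names, not the real Placement or nova API):

```python
# Sketch of generation-based optimistic concurrency, the mechanism
# behind the 409 "resource provider generation conflict" errors above.
# ProviderStore and its methods are hypothetical illustrations.

class ConflictError(Exception):
    """Stands in for Placement's 409 placement.concurrent_update."""


class ProviderStore:
    def __init__(self):
        self.generation = 0      # bumped on every successful write
        self.inventory = {}

    def get(self):
        # A GET returns the current inventory plus its generation.
        return self.generation, dict(self.inventory)

    def put_inventory(self, inventory, generation):
        # A PUT is rejected unless the caller's generation is current.
        if generation != self.generation:
            raise ConflictError("resource provider generation conflict")
        self.inventory = dict(inventory)
        self.generation += 1     # e.g. the 4 -> 5 bump in the log
        return self.generation


def update_with_retry(store, inventory, known_generation, max_attempts=3):
    # Mirrors the report client's behaviour: on a conflict, refresh the
    # generation from the server and try the write again.
    for _ in range(max_attempts):
        try:
            return store.put_inventory(inventory, known_generation)
        except ConflictError:
            known_generation, _ = store.get()
    raise ConflictError("still conflicting after retries")
```

In the log, some other writer (another periodic task or the scheduler) had advanced the generation between this worker's read and its PUT, so two attempts failed before the refreshed generation let the third succeed.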
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.108 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.143 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.143 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.158 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.190 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.298 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.385 186853 DEBUG nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updated inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.386 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.386 186853 DEBUG nova.compute.provider_tree [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.452 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.518 186853 INFO nova.scheduler.client.report [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Deleted allocations for instance 5fde891f-fc12-4c02-8b8d-0475ce19a753#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.598 186853 DEBUG oslo_concurrency.lockutils [None req-0421bafa-2779-4c44-b556-bed8c0093fb5 ae0a9bb236424581bf35f94644a5484c dd45d638bd73499da80359efc81898a3 - - default default] Lock "5fde891f-fc12-4c02-8b8d-0475ce19a753" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.814 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.814 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:53 np0005531887 nova_compute[186849]: 2025-11-22 07:39:53.879 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.078 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.078 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.102 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.102 186853 INFO nova.compute.claims [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.273 186853 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.294 186853 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.349 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
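Editor's note: the `instance_claim` above, taken under the `compute_resources` lock, admits the new instance only if it fits the host's capacity after applying the allocation ratios and reservations reported in the inventory (VCPU ratio 4.0, 512 MB memory reserved). A simplified toy of that admission check, using the numbers from this log (the class and the exact capacity formula here are illustrative, not nova's actual resource tracker):

```python
# Toy sketch of a resource-tracker style claim: admit an instance only
# if usage stays within (total - reserved) * allocation_ratio.
# Values mirror the inventory in the log above; the formula is a
# simplification of what nova/placement actually compute.

class ComputeClaims:
    def __init__(self, total_vcpu=8, vcpu_ratio=4.0,
                 total_mb=7679, mb_reserved=512, mb_ratio=1.0):
        self.vcpu_limit = total_vcpu * vcpu_ratio          # 32 vCPUs
        self.mb_limit = (total_mb - mb_reserved) * mb_ratio  # 7167 MB
        self.used_vcpu = 0
        self.used_mb = 0

    def claim(self, vcpus, memory_mb):
        # Reject if either resource would exceed its overcommit limit.
        if (self.used_vcpu + vcpus > self.vcpu_limit or
                self.used_mb + memory_mb > self.mb_limit):
            return False
        self.used_vcpu += vcpus
        self.used_mb += memory_mb
        return True
```

The real claim also consults NUMA topology (hence the "Require both a host and instance NUMA topology" debug line: with no host NUMA topology reported, NUMA fitting is skipped) and is serialized by the same `compute_resources` lock seen in the lockutils messages.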
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.350 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.463 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.464 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.480 186853 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.508 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.664 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.666 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.667 186853 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Creating image(s)#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.668 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.668 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.669 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.685 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.746 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
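Editor's note: each `qemu-img info` above is wrapped in `oslo_concurrency.prlimit`, which applies an address-space cap (`--as=1073741824`, 1 GiB) and a CPU-time cap (`--cpu=30`, 30 s) before exec'ing the real command, so a malformed or hostile image cannot make the probe consume unbounded resources. A small helper that assembles the same argv seen in the log (`build_qemu_img_info_cmd` is an illustrative name, not nova's API):

```python
# Sketch: how the prlimit-wrapped qemu-img invocations in the log are
# assembled. The limit values mirror the log: 1 GiB address space and
# 30 CPU-seconds. build_qemu_img_info_cmd is an illustrative helper.

def build_qemu_img_info_cmd(path, as_bytes=1073741824, cpu_secs=30):
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        "--as=%d" % as_bytes,    # cap virtual address space
        "--cpu=%d" % cpu_secs,   # cap CPU seconds
        "--",
        "env", "LC_ALL=C", "LANG=C",   # stable, parseable qemu-img output
        "qemu-img", "info", path,
        "--force-share",         # inspect without taking a write lock
        "--output=json",
    ]
```

`--force-share` lets qemu-img inspect an image that a running guest may already hold open, and `--output=json` makes the result machine-parseable for nova.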
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.748 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.749 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.766 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.783 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Automatically allocating a network for project 98627e04b62e4ce4bf9650377c674f73. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.832 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.832 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:54 np0005531887 podman[213500]: 2025-11-22 07:39:54.83833333 +0000 UTC m=+0.062483771 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.891 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
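Editor's note: the `qemu-img create -f qcow2 -o backing_file=…,backing_fmt=raw` command above creates the instance disk as a copy-on-write overlay: the overlay starts nearly empty, reads fall through to the shared raw base image in `_base/`, and only written blocks are stored in the overlay; the trailing `1073741824` sets the overlay's virtual size to 1 GiB. A toy model of that read/write behaviour (block granularity and all names are illustrative):

```python
# Toy model of the qcow2 copy-on-write overlay created above:
# /var/lib/nova/instances/<uuid>/disk backed by the shared raw image
# in _base/. Reads fall through to the backing file until a block is
# written; the base image itself is never modified.

class BackedImage:
    def __init__(self, base_blocks, virtual_blocks):
        self.base = base_blocks        # shared, read-only base image
        self.overlay = {}              # sparse: only written blocks
        self.virtual_blocks = virtual_blocks

    def read(self, idx):
        if idx in self.overlay:        # block already copied-on-write
            return self.overlay[idx]
        if idx < len(self.base):       # fall through to backing file
            return self.base[idx]
        return b"\x00"                 # past the base: reads as zeroes

    def write(self, idx, data):
        if idx >= self.virtual_blocks:
            raise IndexError("beyond virtual disk size")
        self.overlay[idx] = data       # base image stays untouched
```

This is why many instances can share one cached base file, and why the base is guarded by the content-hash lock seen earlier while overlays are created.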
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.892 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.892 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.952 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.953 186853 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Checking if we can resize image /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:39:54 np0005531887 nova_compute[186849]: 2025-11-22 07:39:54.954 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.029 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.030 186853 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Cannot resize image /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.031 186853 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'migration_context' on Instance uuid 6fe80388-ca60-492c-a99f-a338bcce8d5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.048 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.048 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Ensure instance console log exists: /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.048 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.049 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:39:55 np0005531887 nova_compute[186849]: 2025-11-22 07:39:55.049 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.070 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.071 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.114 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.233 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.234 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.240 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.241 186853 INFO nova.compute.claims [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Claim successful on node compute-1.ctlplane.example.com
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.444 186853 DEBUG nova.compute.provider_tree [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.456 186853 DEBUG nova.scheduler.client.report [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.473 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.473 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.519 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.519 186853 DEBUG nova.network.neutron [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.567 186853 INFO nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.594 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.700 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.701 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.701 186853 INFO nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Creating image(s)
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.702 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.702 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.703 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.720 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.776 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.777 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.778 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.789 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.843 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.844 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.990 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk 1073741824" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.991 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:58 np0005531887 nova_compute[186849]: 2025-11-22 07:39:58.992 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.057 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.058 186853 DEBUG nova.virt.disk.api [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Checking if we can resize image /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.059 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.075 186853 DEBUG nova.network.neutron [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.075 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.112 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.113 186853 DEBUG nova.virt.disk.api [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Cannot resize image /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.114 186853 DEBUG nova.objects.instance [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'migration_context' on Instance uuid 6fabeb14-7440-41d0-8be1-453a7607a8ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.126 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.126 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Ensure instance console log exists: /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.127 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.127 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.128 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.129 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.135 186853 WARNING nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.141 186853 DEBUG nova.virt.libvirt.host [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.142 186853 DEBUG nova.virt.libvirt.host [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.146 186853 DEBUG nova.virt.libvirt.host [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.147 186853 DEBUG nova.virt.libvirt.host [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.149 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.149 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.150 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.150 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.151 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.151 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.151 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.151 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.152 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.153 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.154 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.155 186853 DEBUG nova.virt.hardware [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
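The topology trace above (1 vCPU, no flavor/image constraints, so preferred 0:0:0 and limits 65536:65536:65536, yielding the single topology 1:1:1) can be sketched as a minimal reimplementation. This is an illustrative simplification, not the actual `nova.virt.hardware._get_possible_cpu_topologies` code; the real function also applies preference-based sorting and `max_cpus` handling.

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus.

    Simplified sketch of the enumeration logged by nova.virt.hardware:
    every factorisation of the vCPU count within the per-axis limits is a
    candidate topology.
    """
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                topologies.append((sockets, cores, threads))
    return topologies
```

For the m1.nano flavor's single vCPU this produces exactly one candidate, `(1, 1, 1)`, matching the "Got 1 possible topologies" line.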
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.160 186853 DEBUG nova.objects.instance [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'pci_devices' on Instance uuid 6fabeb14-7440-41d0-8be1-453a7607a8ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.184 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <uuid>6fabeb14-7440-41d0-8be1-453a7607a8ed</uuid>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <name>instance-00000007</name>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:name>tempest-LiveMigrationNegativeTest-server-672746011</nova:name>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:39:59</nova:creationTime>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:user uuid="d7697c5198974e5ca1152e4c64815e29">tempest-LiveMigrationNegativeTest-555543298-project-member</nova:user>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:        <nova:project uuid="463b8a0ac0be4ebbb7491f91038a890f">tempest-LiveMigrationNegativeTest-555543298</nova:project>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="serial">6fabeb14-7440-41d0-8be1-453a7607a8ed</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="uuid">6fabeb14-7440-41d0-8be1-453a7607a8ed</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.config"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/console.log" append="off"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:39:59 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:39:59 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:39:59 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:39:59 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
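The domain XML dumped above embeds Nova's own metadata under the `http://openstack.org/xmlns/libvirt/nova/1.1` namespace. When mining logs like this, the instance identity can be pulled out with the standard library; the helper below is a hypothetical convenience, not a Nova API.

```python
import xml.etree.ElementTree as ET

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

def instance_summary(domain_xml):
    """Extract identifying fields from a Nova-generated libvirt domain XML."""
    root = ET.fromstring(domain_xml)
    flavor = root.find(".//nova:flavor", NOVA_NS)  # namespaced Nova metadata
    return {
        "uuid": root.findtext("uuid"),
        "name": root.findtext("name"),
        "flavor": flavor.get("name") if flavor is not None else None,
        "vcpus": int(root.findtext("vcpu")),
        "memory_kib": int(root.findtext("memory")),  # libvirt default unit is KiB
    }
```

Note that `<memory>131072</memory>` is in KiB, i.e. the flavor's 128 MiB.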
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.275 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.275 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.276 186853 INFO nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Using config drive#033[00m
Nov 22 02:39:59 np0005531887 podman[213550]: 2025-11-22 07:39:59.30199426 +0000 UTC m=+0.079856225 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.535 186853 INFO nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Creating config drive at /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.config#033[00m
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.540 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1hf8ix9a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:39:59 np0005531887 nova_compute[186849]: 2025-11-22 07:39:59.662 186853 DEBUG oslo_concurrency.processutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1hf8ix9a" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
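The config-drive build above shells out to mkisofs; the logged command line renders the multi-word `-publisher` value without quotes because oslo.concurrency executes an argument list, not a shell string. A sketch of assembling that argv (function name and structure are illustrative, not Nova's actual config-drive builder):

```python
def mkisofs_cmd(output_path, tmpdir, publisher):
    """Build the mkisofs argv for an ISO9660 config drive labelled config-2.

    Mirrors the flags visible in the nova_compute log; passing an argument
    list avoids any shell quoting of the multi-word -publisher value.
    """
    return [
        "/usr/bin/mkisofs",
        "-o", output_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,
        "-quiet", "-J", "-r",
        "-V", "config-2",  # the volume label cloud-init probes for
        tmpdir,
    ]
```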
Nov 22 02:39:59 np0005531887 systemd-machined[153180]: New machine qemu-2-instance-00000007.
Nov 22 02:39:59 np0005531887 systemd[1]: Started Virtual Machine qemu-2-instance-00000007.
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.286 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797200.286275, 6fabeb14-7440-41d0-8be1-453a7607a8ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.287 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.290 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.291 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.295 186853 INFO nova.virt.libvirt.driver [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance spawned successfully.#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.295 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.326 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.326 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.327 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.327 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.328 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.328 186853 DEBUG nova.virt.libvirt.driver [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.331 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.333 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.386 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.386 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797200.287284, 6fabeb14-7440-41d0-8be1-453a7607a8ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.386 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] VM Started (Lifecycle Event)#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.453 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.456 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.477 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.488 186853 INFO nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Took 1.79 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.489 186853 DEBUG nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.616 186853 INFO nova.compute.manager [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Took 2.42 seconds to build instance.#033[00m
Nov 22 02:40:00 np0005531887 nova_compute[186849]: 2025-11-22 07:40:00.655 186853 DEBUG oslo_concurrency.lockutils [None req-ef89d678-2f77-4f03-983c-e2dc89b047de d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
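The "During sync_power_state the instance has a pending task (spawning). Skip." lines above show the guard that keeps lifecycle-event handling from racing the in-progress build: while `task_state` is set, power-state synchronization is deferred. A simplified sketch of that decision (names are illustrative, not the real `nova.compute.manager` API):

```python
def should_sync_power_state(task_state, db_power_state, vm_power_state):
    """Decide whether a lifecycle event should trigger a power-state sync.

    Sketch of the skip logic traced in the log: a pending task (e.g.
    "spawning") means the build path owns the instance state, so the
    event handler backs off; otherwise sync only on a mismatch.
    """
    if task_state is not None:  # pending task -> skip, as logged
        return False
    return db_power_state != vm_power_state
```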
Nov 22 02:40:03 np0005531887 podman[213601]: 2025-11-22 07:40:03.848384367 +0000 UTC m=+0.065470847 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)

Nov 22 02:40:07 np0005531887 nova_compute[186849]: 2025-11-22 07:40:07.219 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797192.2180626, 5fde891f-fc12-4c02-8b8d-0475ce19a753 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:07 np0005531887 nova_compute[186849]: 2025-11-22 07:40:07.220 186853 INFO nova.compute.manager [-] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:40:07 np0005531887 nova_compute[186849]: 2025-11-22 07:40:07.254 186853 DEBUG nova.compute.manager [None req-c85f41bc-0881-492e-891f-db279b021e23 - - - - - -] [instance: 5fde891f-fc12-4c02-8b8d-0475ce19a753] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:11 np0005531887 nova_compute[186849]: 2025-11-22 07:40:11.633 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Automatically allocated network: {'id': 'cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'name': 'auto_allocated_network', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['68dcf6dc-373a-4168-81d5-f04abc5d8ac8', '826b0fb5-b3d0-49b5-b40e-079f62557646'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-22T07:39:55Z', 'updated_at': '2025-11-22T07:40:10Z', 'revision_number': 4, 'project_id': '98627e04b62e4ce4bf9650377c674f73'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Nov 22 02:40:11 np0005531887 nova_compute[186849]: 2025-11-22 07:40:11.645 186853 WARNING oslo_policy.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 22 02:40:11 np0005531887 nova_compute[186849]: 2025-11-22 07:40:11.646 186853 WARNING oslo_policy.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 22 02:40:11 np0005531887 nova_compute[186849]: 2025-11-22 07:40:11.648 186853 DEBUG nova.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:40:12 np0005531887 podman[213624]: 2025-11-22 07:40:12.858687531 +0000 UTC m=+0.067855787 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:40:12 np0005531887 podman[213625]: 2025-11-22 07:40:12.900592064 +0000 UTC m=+0.095842973 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:40:14 np0005531887 nova_compute[186849]: 2025-11-22 07:40:14.689 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Successfully created port: d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:40:16 np0005531887 nova_compute[186849]: 2025-11-22 07:40:16.938 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Successfully updated port: d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:40:16 np0005531887 nova_compute[186849]: 2025-11-22 07:40:16.962 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:16 np0005531887 nova_compute[186849]: 2025-11-22 07:40:16.963 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquired lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:16 np0005531887 nova_compute[186849]: 2025-11-22 07:40:16.963 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:40:17 np0005531887 nova_compute[186849]: 2025-11-22 07:40:17.283 186853 DEBUG nova.compute.manager [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-changed-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:17 np0005531887 nova_compute[186849]: 2025-11-22 07:40:17.284 186853 DEBUG nova.compute.manager [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Refreshing instance network info cache due to event network-changed-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:40:17 np0005531887 nova_compute[186849]: 2025-11-22 07:40:17.284 186853 DEBUG oslo_concurrency.lockutils [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:17 np0005531887 nova_compute[186849]: 2025-11-22 07:40:17.341 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:40:18 np0005531887 podman[213690]: 2025-11-22 07:40:18.82722818 +0000 UTC m=+0.051417858 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:40:19 np0005531887 nova_compute[186849]: 2025-11-22 07:40:19.963 186853 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updating instance_info_cache with network_info: [{"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.021 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Releasing lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.022 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Instance network_info: |[{"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.023 186853 DEBUG oslo_concurrency.lockutils [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.023 186853 DEBUG nova.network.neutron [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Refreshing network info cache for port d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.027 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Start _get_guest_xml network_info=[{"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.036 186853 WARNING nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.046 186853 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.047 186853 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.053 186853 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.054 186853 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.055 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.055 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.056 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.056 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.057 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.057 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.057 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.057 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.058 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.058 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.058 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.058 186853 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.062 186853 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-3',id=6,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=6fe80388-ca60-492c-a99f-a338bcce8d5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.063 186853 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.063 186853 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.066 186853 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6fe80388-ca60-492c-a99f-a338bcce8d5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.092 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <uuid>6fe80388-ca60-492c-a99f-a338bcce8d5b</uuid>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <name>instance-00000006</name>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:name>tempest-tempest.common.compute-instance-1346960213-3</nova:name>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:40:20</nova:creationTime>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:user uuid="12b223a79f8b4927861908eb11663fb5">tempest-AutoAllocateNetworkTest-83498172-project-member</nova:user>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:project uuid="98627e04b62e4ce4bf9650377c674f73">tempest-AutoAllocateNetworkTest-83498172</nova:project>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        <nova:port uuid="d7f5e5f5-f58a-46a6-92e2-0ec1be38e606">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="fdfe:381f:8400::3c4" ipVersion="6"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.1.0.61" ipVersion="4"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="serial">6fe80388-ca60-492c-a99f-a338bcce8d5b</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="uuid">6fe80388-ca60-492c-a99f-a338bcce8d5b</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.config"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:7c:98:89"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <target dev="tapd7f5e5f5-f5"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/console.log" append="off"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:40:20 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:40:20 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:40:20 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:40:20 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.093 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Preparing to wait for external event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.094 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.094 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.094 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.095 186853 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-3',id=6,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=6fe80388-ca60-492c-a99f-a338bcce8d5b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.095 186853 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.096 186853 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.096 186853 DEBUG os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.142 186853 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.143 186853 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.143 186853 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.144 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.145 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.145 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.150 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.161 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.162 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.162 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.164 186853 INFO oslo.privsep.daemon [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpi8w2c76g/privsep.sock']#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.929 186853 INFO oslo.privsep.daemon [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.763 213719 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.767 213719 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.770 213719 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 22 02:40:20 np0005531887 nova_compute[186849]: 2025-11-22 07:40:20.770 213719 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213719#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.266 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7f5e5f5-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.267 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7f5e5f5-f5, col_values=(('external_ids', {'iface-id': 'd7f5e5f5-f58a-46a6-92e2-0ec1be38e606', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:98:89', 'vm-uuid': '6fe80388-ca60-492c-a99f-a338bcce8d5b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531887 NetworkManager[55210]: <info>  [1763797221.2704] manager: (tapd7f5e5f5-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.273 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.277 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.278 186853 INFO os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5')#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.408 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.409 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.410 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No VIF found with MAC fa:16:3e:7c:98:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:40:21 np0005531887 nova_compute[186849]: 2025-11-22 07:40:21.410 186853 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Using config drive#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.177 186853 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Creating config drive at /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.config#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.184 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeun9kp0h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.307 186853 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeun9kp0h" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:22 np0005531887 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 22 02:40:22 np0005531887 kernel: tapd7f5e5f5-f5: entered promiscuous mode
Nov 22 02:40:22 np0005531887 NetworkManager[55210]: <info>  [1763797222.4038] manager: (tapd7f5e5f5-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 22 02:40:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:22Z|00027|binding|INFO|Claiming lport d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 for this chassis.
Nov 22 02:40:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:22Z|00028|binding|INFO|d7f5e5f5-f58a-46a6-92e2-0ec1be38e606: Claiming fa:16:3e:7c:98:89 10.1.0.61 fdfe:381f:8400::3c4
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:22.423 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:98:89 10.1.0.61 fdfe:381f:8400::3c4'], port_security=['fa:16:3e:7c:98:89 10.1.0.61 fdfe:381f:8400::3c4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.61/26 fdfe:381f:8400::3c4/64', 'neutron:device_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98627e04b62e4ce4bf9650377c674f73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '931bf7c3-500b-4034-8d8e-f18219ff1b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6120d3e5-4a9e-45cc-93a1-87b92bf94714, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:40:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:22.424 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 in datapath cd94b117-ddd2-457a-a1e9-a1e03ac67322 bound to our chassis#033[00m
Nov 22 02:40:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:22.426 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd94b117-ddd2-457a-a1e9-a1e03ac67322#033[00m
Nov 22 02:40:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:22.427 104084 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp4jj_ompl/privsep.sock']#033[00m
Nov 22 02:40:22 np0005531887 systemd-udevd[213765]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:40:22 np0005531887 NetworkManager[55210]: <info>  [1763797222.4581] device (tapd7f5e5f5-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:40:22 np0005531887 NetworkManager[55210]: <info>  [1763797222.4591] device (tapd7f5e5f5-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:40:22 np0005531887 systemd-machined[153180]: New machine qemu-3-instance-00000006.
Nov 22 02:40:22 np0005531887 podman[213736]: 2025-11-22 07:40:22.472445096 +0000 UTC m=+0.092539097 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.483 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:22 np0005531887 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Nov 22 02:40:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:22Z|00029|binding|INFO|Setting lport d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 ovn-installed in OVS
Nov 22 02:40:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:22Z|00030|binding|INFO|Setting lport d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 up in Southbound
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.489 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.706 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797222.705695, 6fe80388-ca60-492c-a99f-a338bcce8d5b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.707 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] VM Started (Lifecycle Event)#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.746 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.750 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797222.7097192, 6fe80388-ca60-492c-a99f-a338bcce8d5b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.751 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.776 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.780 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:22 np0005531887 nova_compute[186849]: 2025-11-22 07:40:22.802 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.149 104084 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.150 104084 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4jj_ompl/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.016 213790 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.020 213790 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.023 213790 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.023 213790 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213790#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.152 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8622288b-c033-45ad-b8e3-a594aa01cece]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.223 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.610 186853 DEBUG nova.compute.manager [req-1c461dd7-82a3-4358-a56e-611c3a6d4a34 req-2f457517-b4a7-4fea-8d59-9dfbab6aa577 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.611 186853 DEBUG oslo_concurrency.lockutils [req-1c461dd7-82a3-4358-a56e-611c3a6d4a34 req-2f457517-b4a7-4fea-8d59-9dfbab6aa577 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.611 186853 DEBUG oslo_concurrency.lockutils [req-1c461dd7-82a3-4358-a56e-611c3a6d4a34 req-2f457517-b4a7-4fea-8d59-9dfbab6aa577 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.611 186853 DEBUG oslo_concurrency.lockutils [req-1c461dd7-82a3-4358-a56e-611c3a6d4a34 req-2f457517-b4a7-4fea-8d59-9dfbab6aa577 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.611 186853 DEBUG nova.compute.manager [req-1c461dd7-82a3-4358-a56e-611c3a6d4a34 req-2f457517-b4a7-4fea-8d59-9dfbab6aa577 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Processing event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.612 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.615 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797223.6151037, 6fe80388-ca60-492c-a99f-a338bcce8d5b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.615 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.618 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.621 186853 INFO nova.virt.libvirt.driver [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Instance spawned successfully.#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.621 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.652 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.656 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.656 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.657 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.657 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.657 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.658 186853 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.661 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.698 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.722 213790 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.723 213790 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:23.723 213790 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.783 186853 INFO nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Took 29.12 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.783 186853 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.886 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.887 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.887 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "6fabeb14-7440-41d0-8be1-453a7607a8ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.887 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.888 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.898 186853 INFO nova.compute.manager [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Terminating instance#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.906 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "refresh_cache-6fabeb14-7440-41d0-8be1-453a7607a8ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.906 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquired lock "refresh_cache-6fabeb14-7440-41d0-8be1-453a7607a8ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.907 186853 DEBUG nova.network.neutron [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:40:23 np0005531887 nova_compute[186849]: 2025-11-22 07:40:23.945 186853 INFO nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Took 29.89 seconds to build instance.#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.018 186853 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 30.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.351 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d6497ee5-a43f-4521-841a-dade5f735ad1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.353 186853 DEBUG nova.network.neutron [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.354 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd94b117-d1 in ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.356 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd94b117-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.356 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d7187f-4b60-44df-be96-c1f7db2ba443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.359 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6b5819-56b5-4bea-86b7-2607bee7ede0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.380 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[45262cbb-2e1c-4bee-8d5e-dd127a1214ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.397 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0c9f7b-f01f-49eb-addf-6b2132b3416e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.400 104084 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpdbh139ej/privsep.sock']#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.799 186853 DEBUG nova.network.neutron [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updated VIF entry in instance network info cache for port d7f5e5f5-f58a-46a6-92e2-0ec1be38e606. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.800 186853 DEBUG nova.network.neutron [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updating instance_info_cache with network_info: [{"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.810 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.824 186853 DEBUG oslo_concurrency.lockutils [req-545af97a-5ed0-4ee1-a60d-bcade8cde94f req-5c8c59cd-98e2-4929-a6ac-1882556adde2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.887 186853 DEBUG nova.network.neutron [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.904 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Releasing lock "refresh_cache-6fabeb14-7440-41d0-8be1-453a7607a8ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:24 np0005531887 nova_compute[186849]: 2025-11-22 07:40:24.904 186853 DEBUG nova.compute.manager [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:40:24 np0005531887 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 22 02:40:24 np0005531887 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Consumed 14.425s CPU time.
Nov 22 02:40:24 np0005531887 systemd-machined[153180]: Machine qemu-2-instance-00000007 terminated.
Nov 22 02:40:25 np0005531887 podman[213805]: 2025-11-22 07:40:25.086455996 +0000 UTC m=+0.103607453 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.088 104084 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.089 104084 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdbh139ej/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.959 213804 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.964 213804 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.967 213804 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:24.967 213804 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213804#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.092 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[474c58a5-6d4e-4165-8233-218fe4b3e59f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.147 186853 INFO nova.virt.libvirt.driver [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance destroyed successfully.#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.147 186853 DEBUG nova.objects.instance [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'resources' on Instance uuid 6fabeb14-7440-41d0-8be1-453a7607a8ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.158 186853 INFO nova.virt.libvirt.driver [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Deleting instance files /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed_del#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.159 186853 INFO nova.virt.libvirt.driver [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Deletion of /var/lib/nova/instances/6fabeb14-7440-41d0-8be1-453a7607a8ed_del complete#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.378 186853 INFO nova.compute.manager [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.378 186853 DEBUG oslo.service.loopingcall [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.378 186853 DEBUG nova.compute.manager [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.379 186853 DEBUG nova.network.neutron [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.622 213804 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.622 213804 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:25.622 213804 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.756 186853 DEBUG nova.compute.manager [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.757 186853 DEBUG oslo_concurrency.lockutils [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.757 186853 DEBUG oslo_concurrency.lockutils [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.757 186853 DEBUG oslo_concurrency.lockutils [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.757 186853 DEBUG nova.compute.manager [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] No waiting events found dispatching network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.758 186853 WARNING nova.compute.manager [req-fd95de13-9a4f-464f-9f84-4fc138cfed1b req-04653cc9-3433-47e9-bf85-273cbd60e292 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received unexpected event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.762 186853 DEBUG nova.network.neutron [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.786 186853 DEBUG nova.network.neutron [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.813 186853 INFO nova.compute.manager [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Took 0.43 seconds to deallocate network for instance.#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.908 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:25 np0005531887 nova_compute[186849]: 2025-11-22 07:40:25.909 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.040 186853 DEBUG nova.compute.provider_tree [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.056 186853 DEBUG nova.scheduler.client.report [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.270 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.297 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.312 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[238f7bfc-16bf-4cd4-9037-0081014444a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 NetworkManager[55210]: <info>  [1763797226.3394] manager: (tapcd94b117-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.338 186853 INFO nova.scheduler.client.report [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Deleted allocations for instance 6fabeb14-7440-41d0-8be1-453a7607a8ed#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.337 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b669dcc2-045b-44e7-a2e0-4b6f14e76658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 systemd-udevd[213845]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.380 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3b442e61-179b-418c-ace1-7a49a26b6bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.389 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[00acfb08-cd6d-47e3-b30b-f603622e1706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 NetworkManager[55210]: <info>  [1763797226.4228] device (tapcd94b117-d0): carrier: link connected
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.429 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d5672026-a697-4280-8182-67f21c535721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.449 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[98e5655f-e164-41fe-a19a-8ce746a7bb44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396902, 'reachable_time': 20696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213863, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.465 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[66c33bb8-9453-4c4f-b432-a61aecf2845b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:dfb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396902, 'tstamp': 396902}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213864, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.482 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f1887642-c7da-4120-87a3-3341d8ea9675]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396902, 'reachable_time': 20696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213865, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.514 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a3d242-61f8-47a1-b652-5726cd8d3e25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.577 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c851fa59-9570-4901-a662-18e081896698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.579 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd94b117-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.579 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.580 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd94b117-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.582 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531887 NetworkManager[55210]: <info>  [1763797226.5861] manager: (tapcd94b117-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 22 02:40:26 np0005531887 kernel: tapcd94b117-d0: entered promiscuous mode
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.591 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd94b117-d0, col_values=(('external_ids', {'iface-id': 'f15694ec-11c8-44d4-a18a-7277c1308d45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.590 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:26Z|00031|binding|INFO|Releasing lport f15694ec-11c8-44d4-a18a-7277c1308d45 from this chassis (sb_readonly=0)
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.592 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.614 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.615 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[555671df-92a0-4ba4-82e5-45046d36cb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.618 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:40:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:26.619 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'env', 'PROCESS_TAG=haproxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd94b117-ddd2-457a-a1e9-a1e03ac67322.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:40:26 np0005531887 nova_compute[186849]: 2025-11-22 07:40:26.755 186853 DEBUG oslo_concurrency.lockutils [None req-22c29e02-98f0-42e8-97ca-db5a9fd35757 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "6fabeb14-7440-41d0-8be1-453a7607a8ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:27 np0005531887 podman[213898]: 2025-11-22 07:40:27.016247718 +0000 UTC m=+0.069869092 container create 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:40:27 np0005531887 systemd[1]: Started libpod-conmon-7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c.scope.
Nov 22 02:40:27 np0005531887 podman[213898]: 2025-11-22 07:40:26.970466106 +0000 UTC m=+0.024087500 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:40:27 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:40:27 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d88a1f422fc85b44ba2b00a6360e3f911d9204d0b4efd2359b139838fa06b86f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:40:27 np0005531887 podman[213898]: 2025-11-22 07:40:27.120623008 +0000 UTC m=+0.174244402 container init 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 02:40:27 np0005531887 podman[213898]: 2025-11-22 07:40:27.127798191 +0000 UTC m=+0.181419565 container start 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:40:27 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [NOTICE]   (213917) : New worker (213919) forked
Nov 22 02:40:27 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [NOTICE]   (213917) : Loading success.
Nov 22 02:40:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:27.203 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:40:29 np0005531887 nova_compute[186849]: 2025-11-22 07:40:29.103 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:29 np0005531887 nova_compute[186849]: 2025-11-22 07:40:29.812 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:29 np0005531887 podman[213928]: 2025-11-22 07:40:29.866919851 +0000 UTC m=+0.081024581 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:40:30 np0005531887 nova_compute[186849]: 2025-11-22 07:40:30.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:30 np0005531887 nova_compute[186849]: 2025-11-22 07:40:30.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:31 np0005531887 nova_compute[186849]: 2025-11-22 07:40:31.272 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:32 np0005531887 nova_compute[186849]: 2025-11-22 07:40:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:32 np0005531887 nova_compute[186849]: 2025-11-22 07:40:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:40:32 np0005531887 nova_compute[186849]: 2025-11-22 07:40:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:40:33 np0005531887 nova_compute[186849]: 2025-11-22 07:40:33.380 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:40:33 np0005531887 nova_compute[186849]: 2025-11-22 07:40:33.381 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:40:33 np0005531887 nova_compute[186849]: 2025-11-22 07:40:33.382 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:40:33 np0005531887 nova_compute[186849]: 2025-11-22 07:40:33.382 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6fe80388-ca60-492c-a99f-a338bcce8d5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:40:34 np0005531887 nova_compute[186849]: 2025-11-22 07:40:34.814 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:34 np0005531887 podman[213952]: 2025-11-22 07:40:34.851621428 +0000 UTC m=+0.067415923 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 02:40:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:36.205 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:40:36 np0005531887 nova_compute[186849]: 2025-11-22 07:40:36.275 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.194 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a11a3876e3b0d6d40540fef270f2527f376e5c95223cda6a3c2983f6e783b7d9" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:37.313 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:37.314 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:40:37.314 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.361 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1183 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-396407b2-3aed-4a8c-9086-178e1c2f4bdc x-openstack-request-id: req-396407b2-3aed-4a8c-9086-178e1c2f4bdc _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.361 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1890587748", "name": "tempest-flavor_with_ephemeral_0-22804361", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1890587748"}]}, {"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}, {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}, {"id": "962835561", "name": "tempest-flavor_with_ephemeral_1-2110312460", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/962835561"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/962835561"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.362 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-396407b2-3aed-4a8c-9086-178e1c2f4bdc request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.363 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a11a3876e3b0d6d40540fef270f2527f376e5c95223cda6a3c2983f6e783b7d9" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.816 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-14d3e8b8-85f6-4efe-9b1d-2777fb435df9 x-openstack-request-id: req-14d3e8b8-85f6-4efe-9b1d-2777fb435df9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.816 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.816 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a used request id req-14d3e8b8-85f6-4efe-9b1d-2777fb435df9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.818 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'name': 'tempest-tempest.common.compute-instance-1346960213-3', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'user_id': '12b223a79f8b4927861908eb11663fb5', 'hostId': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.825 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6fe80388-ca60-492c-a99f-a338bcce8d5b / tapd7f5e5f5-f5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.825 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b592864-719c-49ee-868e-2a7999efd6d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.819530', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89ab58a6-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': '656ac866d25d29a1ba73b5cbd8842acd41a8f9eb47aad4503e9a5f2ab2064320'}]}, 'timestamp': '2025-11-22 07:40:37.826537', '_unique_id': '3a70b53f4c224a17956008f488c3be40'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.833 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.836 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.868 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.latency volume: 17652092059 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.868 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0968fc35-72f3-469e-846c-5ba687f1a501', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17652092059, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.836849', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b1d096-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '97028b02513cd45836df1fc36c62aafbc08cedb460fc6fb185d4b55cdf462f8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.836849', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b1e16c-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '402dc49741d112881073a245860797fe8e8478390e3f5b9686e714b8fe646201'}]}, 'timestamp': '2025-11-22 07:40:37.869167', '_unique_id': '3c1dfe33c8e3412ab4d87ced1bce1d98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '598c76b4-d221-4b39-b903-8eacf933baf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.871017', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89b237a2-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'aa20c0afa4268e79c7c8eb1916ab27b367282d9a2cac1d0d6cee172a88ffc9d8'}]}, 'timestamp': '2025-11-22 07:40:37.871348', '_unique_id': '52eaf95ce780473fb3f0cc10afe18c17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.871 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.872 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '627a939b-b99d-485d-a5a8-830baa089ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.872579', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89b27258-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': '53633ae5deb17eee51691f53ab87a028b4d49b0fc24b386b832b58992e6799c6'}]}, 'timestamp': '2025-11-22 07:40:37.872815', '_unique_id': '986eff3be73f4d5b8262f8a592cb1a44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.873 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.requests volume: 186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.874 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39d87cb2-b5fb-4db3-9665-15d43eafc200', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 186, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.873941', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b2a75a-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': 'd05651049de723c943b1ca6b05502fea12f61b8782f4261c24b0a25276c029f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.873941', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b2b1b4-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '12e589334912af59f70180939a3af7d8b8a6a3edeca27e93819a3df562f9dac8'}]}, 'timestamp': '2025-11-22 07:40:37.874484', '_unique_id': 'da78be7a3bc94827b8de21b8af88a1f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.875 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.latency volume: 1358461413 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.latency volume: 41308016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75b638ae-6bd7-477c-a355-1ea8a0ac3441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1358461413, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.875893', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b2f480-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '7c67d8468bb1c5a35f298db977778ffe5ea259f649c4180164fcf487192f8516'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41308016, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.875893', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b2ff34-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': 'b4433964e7c518cda5f55fca97e75da1c7eb13d1fdf1213e9bd5669ec72f6f94'}]}, 'timestamp': '2025-11-22 07:40:37.876408', '_unique_id': 'c71e62e299fc458baa876c2b7139b857'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.876 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.877 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.877 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.877 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>]
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.878 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.878 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b189e21-4a9f-4c4c-942c-d8ca4eaed191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.878131', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b34c00-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '0646a43426fcd8d0a00491e5d219471724f072d3cc941a168ebb57eafc1a8d08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.878131', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b355b0-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '1ccdc15c7b97095459e2d15ffa8495b0c6b9de01acac16b8839741cf7f8ac946'}]}, 'timestamp': '2025-11-22 07:40:37.878632', '_unique_id': '147e6e58f61f4abbbd1b44c76e03a29a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.879 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c03d192-2772-4052-845e-76481a0874b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.879871', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89b38fd0-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'bfe615d1f664a3278696eb8c972d8e31b039d0eb1769f5b54da66985a4ba709f'}]}, 'timestamp': '2025-11-22 07:40:37.880153', '_unique_id': '9f523296ac584684bcee2d08e5e43762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.880 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.881 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.bytes volume: 25333760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.881 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '968362d4-362d-4177-aa32-6d0137351ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25333760, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.881441', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b3cd88-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '30eadcfe0c0a4d0a618a62d42917de5b16f1e496cc7f6b8c530ef0197ef33c69'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 
'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.881441', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b3d7a6-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '0e09fa5cde9ecc51403838ca0ec909d7fdd5f5adeb18da03b646587cca5cb277'}]}, 'timestamp': '2025-11-22 07:40:37.881971', '_unique_id': '70cd1cd87c2c46338b431bff8204ddb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.882 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.895 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.895 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5eb54008-57e3-4197-81bb-46f5533c3ce1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.883231', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89b5f554-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': '16b182841e92cf82f67a13ff8d3cd7cd2d5f53bde6b84dd4938b6ec9d03bd284'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 
'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.883231', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89b602f6-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': '680bd3bed647790046702d228e6893abfe4a8db16099d4e41e1f36bd85993639'}]}, 'timestamp': '2025-11-22 07:40:37.896216', '_unique_id': '90c5a79eaed8453d8914b485cf1f1f27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.897 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.898 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '909dfe42-1f2a-4ec8-ab8d-9277df2e44c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.898297', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89b6632c-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': '5666a2cca12520d0b2ca4f25d278dbe21c251fc104f8c73d00d59df901666c0b'}]}, 'timestamp': '2025-11-22 07:40:37.898690', '_unique_id': '1b82b09411b6462bb01b633c2cb1b353'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.899 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.900 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.900 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>]
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.900 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38130fdd-333d-4973-91ae-281facaa8008', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.900640', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89b6bc32-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': '0424721f77d8fa97b16f77446fae9a2ef8b08826418211080c06a2b364b63c58'}]}, 'timestamp': '2025-11-22 07:40:37.900961', '_unique_id': 'c254533a97904bd7a98ede58f6ea553e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.901 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.902 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.902 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>]
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.902 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.919 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/cpu volume: 12540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be66daf5-24b3-49e7-943a-9946647e53dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12540000000, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'timestamp': '2025-11-22T07:40:37.902980', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '89b9b4a0-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.589443747, 'message_signature': 'e64a203e69030104c53e76ed4ccfef2ebb4b68b7bf7ea802c3d56332b0d261cd'}]}, 'timestamp': '2025-11-22 07:40:37.920473', '_unique_id': '2dddde7766164c24a66df4ce4eb66228'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.921 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.922 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.922 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '614048d6-e478-4d2a-8325-577984552bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.922132', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89ba0428-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': '598cfbe98322d6432e91e23d4f2e2f8574924bcdf80922bbcea2b855f92e6fc4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 
'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.922132', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89ba1166-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.506810909, 'message_signature': 'ca2dbcdc124698fdb31ed71024753cbf2d68c9cd57d4893e2858c250b478402b'}]}, 'timestamp': '2025-11-22 07:40:37.922781', '_unique_id': 'a076fc429be649c888ae2af241bd4690'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.923 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dccb7feb-9f9a-40ae-bf13-3d9bed110ca5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.924138', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89ba51e4-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'cf89fbbf39d79d7c63f73344aa2f7aa53e745eeab96eaff4291fb0200ef0572a'}]}, 'timestamp': '2025-11-22 07:40:37.924414', '_unique_id': '57f969c3ed414415bee27cfc43798d3a'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.924 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.925 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.allocation volume: 29237248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.925 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf7889f7-8acc-46a3-a52f-96fa160b5d7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29237248, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.925528', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89ba8682-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': '2e11268f6bdde481e006ea1a61cc22f55707fbe26dfdba6287002469def824d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.925528', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89ba8e7a-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': 'ae11121a61b9c60ff6881d0dba7378f971393c06df70717756136136008949e8'}]}, 'timestamp': '2025-11-22 07:40:37.925946', '_unique_id': 'eea6046396854d419424ff9edc08ef3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.926 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f335c642-c761-46ec-ac31-19644e2f6cfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.927132', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89bac868-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'b8fd47b3c690ffd73300f62b5cecd7cf350f0e1e5075e8b150597859c5bfda9a'}]}, 'timestamp': '2025-11-22 07:40:37.927481', '_unique_id': 'cd67af816676404393578f85759c185b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.927 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.928 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de8067ea-4161-4fc2-bb0e-8ce809892f22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.928584', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89baff18-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'a3dee68f3cb1d34746a5dd94d3e029e23b1a184a448dedd4fea6a66223decb17'}]}, 'timestamp': '2025-11-22 07:40:37.928878', '_unique_id': '4a616f87d20c4299a30194b825f2c9a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.929 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.930 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.930 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26d4ac17-65d5-4051-95fd-8a4e82b7f93f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'timestamp': '2025-11-22T07:40:37.930158', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '89bb3de8-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.589443747, 'message_signature': '723db906b9342a89d41af9a812992f45000e1f37ef9d83e46b0ef258e7d0f95c'}]}, 'timestamp': '2025-11-22 07:40:37.930512', '_unique_id': '1fcca25ea9c3426d9f636cb479a2c4bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.931 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb39610a-f3a1-4dc1-8a12-6b835fdfc8db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-vda', 'timestamp': '2025-11-22T07:40:37.931702', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '89bb7916-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': 'c2010678fd3c398201dc30990d463c5ddd5e27c923c362dd12c2468a5d838773'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b-sda', 'timestamp': '2025-11-22T07:40:37.931702', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'instance-00000006', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '89bb83ca-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.553170814, 'message_signature': 'e893373864b640bd38902e8ac7c401332251174fb51168ac760e2abd7ae42a39'}]}, 'timestamp': '2025-11-22 07:40:37.932293', '_unique_id': '4a53407bebe646e4a966efc5a2e1dc62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.932 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.933 12 DEBUG ceilometer.compute.pollsters [-] 6fe80388-ca60-492c-a99f-a338bcce8d5b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2e04f9e-5ef3-4868-b3f8-000b27d80895', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000006-6fe80388-ca60-492c-a99f-a338bcce8d5b-tapd7f5e5f5-f5', 'timestamp': '2025-11-22T07:40:37.933806', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-3', 'name': 'tapd7f5e5f5-f5', 'instance_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'instance_type': 'm1.nano', 'host': '7e9e89413c0d2fcbe0cd4f3560f7c52ba231968510f86e127911473b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:98:89', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7f5e5f5-f5'}, 'message_id': '89bbccb8-c776-11f0-9b25-fa163ecc0304', 'monotonic_time': 3980.489497163, 'message_signature': 'b088ead6e1be76f2bc56c91a4257372df38692cef6cce7c4845105ac74c7cfd3'}]}, 'timestamp': '2025-11-22 07:40:37.934167', '_unique_id': '6ea38c01f6064207851d43ccbe897a20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.934 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.935 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:40:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:40:37.935 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-3>]
Nov 22 02:40:38 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:98:89 10.1.0.61
Nov 22 02:40:38 np0005531887 ovn_controller[95130]: 2025-11-22T07:40:38Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:98:89 10.1.0.61
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.406 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updating instance_info_cache with network_info: [{"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.546 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-6fe80388-ca60-492c-a99f-a338bcce8d5b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.547 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.547 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.548 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.548 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.548 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.549 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.549 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.600 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.601 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.601 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.601 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.816 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.821 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.881 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.882 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:40:39 np0005531887 nova_compute[186849]: 2025-11-22 07:40:39.967 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.130 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.132 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5647MB free_disk=73.43392562866211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.133 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.133 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.144 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797225.143678, 6fabeb14-7440-41d0-8be1-453a7607a8ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.144 186853 INFO nova.compute.manager [-] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.180 186853 DEBUG nova.compute.manager [None req-10d280d2-0a85-4551-bd3f-9c8614eafac5 - - - - - -] [instance: 6fabeb14-7440-41d0-8be1-453a7607a8ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.278 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 6fe80388-ca60-492c-a99f-a338bcce8d5b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.279 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.279 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.358 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.391 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.429 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:40:40 np0005531887 nova_compute[186849]: 2025-11-22 07:40:40.430 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:40:41 np0005531887 nova_compute[186849]: 2025-11-22 07:40:41.278 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:43 np0005531887 podman[213999]: 2025-11-22 07:40:43.853423136 +0000 UTC m=+0.075821334 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:40:43 np0005531887 podman[214000]: 2025-11-22 07:40:43.870368074 +0000 UTC m=+0.088047599 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:40:44 np0005531887 nova_compute[186849]: 2025-11-22 07:40:44.819 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:46 np0005531887 nova_compute[186849]: 2025-11-22 07:40:46.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:49 np0005531887 nova_compute[186849]: 2025-11-22 07:40:49.821 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:49 np0005531887 podman[214045]: 2025-11-22 07:40:49.827659338 +0000 UTC m=+0.051293845 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:40:51 np0005531887 nova_compute[186849]: 2025-11-22 07:40:51.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:52 np0005531887 podman[214069]: 2025-11-22 07:40:52.831204828 +0000 UTC m=+0.052116754 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:40:54 np0005531887 nova_compute[186849]: 2025-11-22 07:40:54.824 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:55 np0005531887 podman[214089]: 2025-11-22 07:40:55.834410729 +0000 UTC m=+0.057834042 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:40:56 np0005531887 nova_compute[186849]: 2025-11-22 07:40:56.285 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:40:59 np0005531887 nova_compute[186849]: 2025-11-22 07:40:59.826 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:00 np0005531887 podman[214110]: 2025-11-22 07:41:00.835312677 +0000 UTC m=+0.053275743 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:41:01 np0005531887 nova_compute[186849]: 2025-11-22 07:41:01.288 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.043 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.044 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.044 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.044 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.044 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.053 186853 INFO nova.compute.manager [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Terminating instance#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.059 186853 DEBUG nova.compute.manager [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:41:04 np0005531887 kernel: tapd7f5e5f5-f5 (unregistering): left promiscuous mode
Nov 22 02:41:04 np0005531887 NetworkManager[55210]: <info>  [1763797264.0817] device (tapd7f5e5f5-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:41:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:41:04Z|00032|binding|INFO|Releasing lport d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 from this chassis (sb_readonly=0)
Nov 22 02:41:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:41:04Z|00033|binding|INFO|Setting lport d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 down in Southbound
Nov 22 02:41:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:41:04Z|00034|binding|INFO|Removing iface tapd7f5e5f5-f5 ovn-installed in OVS
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.088 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.107 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.116 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:98:89 10.1.0.61 fdfe:381f:8400::3c4'], port_security=['fa:16:3e:7c:98:89 10.1.0.61 fdfe:381f:8400::3c4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.61/26 fdfe:381f:8400::3c4/64', 'neutron:device_id': '6fe80388-ca60-492c-a99f-a338bcce8d5b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98627e04b62e4ce4bf9650377c674f73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '931bf7c3-500b-4034-8d8e-f18219ff1b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6120d3e5-4a9e-45cc-93a1-87b92bf94714, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.118 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 in datapath cd94b117-ddd2-457a-a1e9-a1e03ac67322 unbound from our chassis#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.119 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd94b117-ddd2-457a-a1e9-a1e03ac67322, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.121 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[21058ac8-486c-426b-a3be-7ecda19905da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.121 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 namespace which is not needed anymore#033[00m
Nov 22 02:41:04 np0005531887 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 22 02:41:04 np0005531887 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 16.237s CPU time.
Nov 22 02:41:04 np0005531887 systemd-machined[153180]: Machine qemu-3-instance-00000006 terminated.
Nov 22 02:41:04 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [NOTICE]   (213917) : haproxy version is 2.8.14-c23fe91
Nov 22 02:41:04 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [NOTICE]   (213917) : path to executable is /usr/sbin/haproxy
Nov 22 02:41:04 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [WARNING]  (213917) : Exiting Master process...
Nov 22 02:41:04 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [ALERT]    (213917) : Current worker (213919) exited with code 143 (Terminated)
Nov 22 02:41:04 np0005531887 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213913]: [WARNING]  (213917) : All workers exited. Exiting... (0)
Nov 22 02:41:04 np0005531887 systemd[1]: libpod-7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c.scope: Deactivated successfully.
Nov 22 02:41:04 np0005531887 conmon[213913]: conmon 7ce24ac3c6498975ec04 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c.scope/container/memory.events
Nov 22 02:41:04 np0005531887 podman[214158]: 2025-11-22 07:41:04.270921061 +0000 UTC m=+0.052364251 container died 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.290 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.327 186853 INFO nova.virt.libvirt.driver [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Instance destroyed successfully.#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.328 186853 DEBUG nova.objects.instance [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'resources' on Instance uuid 6fe80388-ca60-492c-a99f-a338bcce8d5b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.356 186853 DEBUG nova.virt.libvirt.vif [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-3',id=6,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-11-22T07:40:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:40:23Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=6fe80388-ca60-492c-a99f-a338bcce8d5b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.356 186853 DEBUG nova.network.os_vif_util [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "address": "fa:16:3e:7c:98:89", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3c4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7f5e5f5-f5", "ovs_interfaceid": "d7f5e5f5-f58a-46a6-92e2-0ec1be38e606", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.357 186853 DEBUG nova.network.os_vif_util [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.358 186853 DEBUG os_vif [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.359 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.360 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7f5e5f5-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.361 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.365 186853 INFO os_vif [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:98:89,bridge_name='br-int',has_traffic_filtering=True,id=d7f5e5f5-f58a-46a6-92e2-0ec1be38e606,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7f5e5f5-f5')#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.366 186853 INFO nova.virt.libvirt.driver [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Deleting instance files /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b_del#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.367 186853 INFO nova.virt.libvirt.driver [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Deletion of /var/lib/nova/instances/6fe80388-ca60-492c-a99f-a338bcce8d5b_del complete#033[00m
Nov 22 02:41:04 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c-userdata-shm.mount: Deactivated successfully.
Nov 22 02:41:04 np0005531887 systemd[1]: var-lib-containers-storage-overlay-d88a1f422fc85b44ba2b00a6360e3f911d9204d0b4efd2359b139838fa06b86f-merged.mount: Deactivated successfully.
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.540 186853 INFO nova.compute.manager [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.540 186853 DEBUG oslo.service.loopingcall [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.541 186853 DEBUG nova.compute.manager [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.541 186853 DEBUG nova.network.neutron [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:41:04 np0005531887 podman[214158]: 2025-11-22 07:41:04.560953948 +0000 UTC m=+0.342397138 container cleanup 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 02:41:04 np0005531887 systemd[1]: libpod-conmon-7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c.scope: Deactivated successfully.
Nov 22 02:41:04 np0005531887 podman[214205]: 2025-11-22 07:41:04.626096734 +0000 UTC m=+0.042899282 container remove 7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.631 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ead0928-7faf-49df-be77-daf94ea59b9d]: (4, ('Sat Nov 22 07:41:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 (7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c)\n7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c\nSat Nov 22 07:41:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 (7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c)\n7ce24ac3c6498975ec04df0528a998c64f90169ab6c9331f4df41d3ea0dd663c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.633 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[97363e54-ce80-4de2-8ef1-9a30d4d87964]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.635 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd94b117-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.638 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 kernel: tapcd94b117-d0: left promiscuous mode
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.649 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.652 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4e47c00c-df19-4175-b1e3-d7d9f5ad469b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.667 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[63f97257-5350-46bb-8fc8-ea5f745c41a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.669 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b8358f63-3b4a-4b3a-90e0-68875f8ea7a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.685 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9f528ed0-693a-4a36-b95d-d8a841a6260d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396891, 'reachable_time': 19521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214220, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.698 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:41:04 np0005531887 systemd[1]: run-netns-ovnmeta\x2dcd94b117\x2dddd2\x2d457a\x2da1e9\x2da1e03ac67322.mount: Deactivated successfully.
Nov 22 02:41:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:04.699 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a1772a-1a87-4cef-9715-2cdf218154d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:41:04 np0005531887 nova_compute[186849]: 2025-11-22 07:41:04.872 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:05 np0005531887 podman[214222]: 2025-11-22 07:41:05.849051593 +0000 UTC m=+0.065010555 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.895 186853 DEBUG nova.compute.manager [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-unplugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.896 186853 DEBUG oslo_concurrency.lockutils [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.896 186853 DEBUG oslo_concurrency.lockutils [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.896 186853 DEBUG oslo_concurrency.lockutils [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.896 186853 DEBUG nova.compute.manager [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] No waiting events found dispatching network-vif-unplugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:41:06 np0005531887 nova_compute[186849]: 2025-11-22 07:41:06.897 186853 DEBUG nova.compute.manager [req-a7a93e70-3410-4b79-b3f2-791571d34ef4 req-51b6a903-397a-46eb-947f-ca03c40bbf55 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-unplugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.525 186853 DEBUG nova.network.neutron [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.603 186853 INFO nova.compute.manager [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Took 3.06 seconds to deallocate network for instance.#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.746 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.747 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.853 186853 DEBUG nova.compute.provider_tree [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.875 186853 DEBUG nova.scheduler.client.report [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.902 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:07 np0005531887 nova_compute[186849]: 2025-11-22 07:41:07.997 186853 INFO nova.scheduler.client.report [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Deleted allocations for instance 6fe80388-ca60-492c-a99f-a338bcce8d5b#033[00m
Nov 22 02:41:08 np0005531887 nova_compute[186849]: 2025-11-22 07:41:08.146 186853 DEBUG oslo_concurrency.lockutils [None req-bb54bd2c-1f7d-40d7-a655-87f31066c5dd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:08 np0005531887 nova_compute[186849]: 2025-11-22 07:41:08.613 186853 DEBUG nova.compute.manager [req-b9819b40-2cfb-4293-a8b5-efb72286303c req-b75e6dc2-9b95-4e8e-8e03-f287bc633399 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-deleted-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.193 186853 DEBUG nova.compute.manager [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.194 186853 DEBUG oslo_concurrency.lockutils [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.194 186853 DEBUG oslo_concurrency.lockutils [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.194 186853 DEBUG oslo_concurrency.lockutils [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6fe80388-ca60-492c-a99f-a338bcce8d5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.194 186853 DEBUG nova.compute.manager [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] No waiting events found dispatching network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.194 186853 WARNING nova.compute.manager [req-2d8f0bec-fe80-44c2-ac5d-a1871df21eea req-9a6b24f0-0fce-4f48-9513-95013ac470a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Received unexpected event network-vif-plugged-d7f5e5f5-f58a-46a6-92e2-0ec1be38e606 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:09 np0005531887 nova_compute[186849]: 2025-11-22 07:41:09.874 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:14 np0005531887 nova_compute[186849]: 2025-11-22 07:41:14.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:14 np0005531887 podman[214246]: 2025-11-22 07:41:14.836806586 +0000 UTC m=+0.055217139 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 02:41:14 np0005531887 podman[214247]: 2025-11-22 07:41:14.86359924 +0000 UTC m=+0.079928863 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:41:14 np0005531887 nova_compute[186849]: 2025-11-22 07:41:14.874 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:19 np0005531887 nova_compute[186849]: 2025-11-22 07:41:19.325 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797264.3245606, 6fe80388-ca60-492c-a99f-a338bcce8d5b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:41:19 np0005531887 nova_compute[186849]: 2025-11-22 07:41:19.325 186853 INFO nova.compute.manager [-] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:41:19 np0005531887 nova_compute[186849]: 2025-11-22 07:41:19.345 186853 DEBUG nova.compute.manager [None req-e826ef21-f0f3-43e1-8327-3c8c3a2f8045 - - - - - -] [instance: 6fe80388-ca60-492c-a99f-a338bcce8d5b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:41:19 np0005531887 nova_compute[186849]: 2025-11-22 07:41:19.368 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:19 np0005531887 nova_compute[186849]: 2025-11-22 07:41:19.875 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:20 np0005531887 podman[214291]: 2025-11-22 07:41:20.837666217 +0000 UTC m=+0.054556035 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:41:21 np0005531887 nova_compute[186849]: 2025-11-22 07:41:21.304 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:23 np0005531887 podman[214315]: 2025-11-22 07:41:23.828243676 +0000 UTC m=+0.050900066 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:41:24 np0005531887 nova_compute[186849]: 2025-11-22 07:41:24.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:24.671 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:41:24 np0005531887 nova_compute[186849]: 2025-11-22 07:41:24.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:24.673 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:41:24 np0005531887 nova_compute[186849]: 2025-11-22 07:41:24.878 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:26 np0005531887 podman[214335]: 2025-11-22 07:41:26.835155337 +0000 UTC m=+0.059071122 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:41:29 np0005531887 nova_compute[186849]: 2025-11-22 07:41:29.375 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:29 np0005531887 nova_compute[186849]: 2025-11-22 07:41:29.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:31 np0005531887 nova_compute[186849]: 2025-11-22 07:41:31.651 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:31.676 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:41:31 np0005531887 nova_compute[186849]: 2025-11-22 07:41:31.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:31 np0005531887 podman[214356]: 2025-11-22 07:41:31.827906298 +0000 UTC m=+0.051602823 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:41:32 np0005531887 nova_compute[186849]: 2025-11-22 07:41:32.788 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:33 np0005531887 nova_compute[186849]: 2025-11-22 07:41:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:33 np0005531887 nova_compute[186849]: 2025-11-22 07:41:33.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.800 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.947 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.947 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5813MB free_disk=73.46184539794922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:34 np0005531887 nova_compute[186849]: 2025-11-22 07:41:34.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.042 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.042 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.071 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.093 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.142 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:41:35 np0005531887 nova_compute[186849]: 2025-11-22 07:41:35.143 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:36 np0005531887 podman[214382]: 2025-11-22 07:41:36.834890691 +0000 UTC m=+0.056377777 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:37.313 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:37.314 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:41:37.314 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:41:39 np0005531887 nova_compute[186849]: 2025-11-22 07:41:39.380 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:39 np0005531887 nova_compute[186849]: 2025-11-22 07:41:39.883 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:44 np0005531887 nova_compute[186849]: 2025-11-22 07:41:44.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:44 np0005531887 nova_compute[186849]: 2025-11-22 07:41:44.887 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:45 np0005531887 podman[214405]: 2025-11-22 07:41:45.842531472 +0000 UTC m=+0.050587387 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 02:41:45 np0005531887 podman[214406]: 2025-11-22 07:41:45.853626379 +0000 UTC m=+0.068026907 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:41:49 np0005531887 nova_compute[186849]: 2025-11-22 07:41:49.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:49 np0005531887 nova_compute[186849]: 2025-11-22 07:41:49.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:51 np0005531887 podman[214447]: 2025-11-22 07:41:51.825240139 +0000 UTC m=+0.046265304 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:41:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:41:52Z|00035|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 02:41:54 np0005531887 nova_compute[186849]: 2025-11-22 07:41:54.388 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:54 np0005531887 podman[214472]: 2025-11-22 07:41:54.826199867 +0000 UTC m=+0.045465965 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:41:54 np0005531887 nova_compute[186849]: 2025-11-22 07:41:54.889 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:57 np0005531887 podman[214494]: 2025-11-22 07:41:57.839311438 +0000 UTC m=+0.058816107 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 02:41:59 np0005531887 nova_compute[186849]: 2025-11-22 07:41:59.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:41:59 np0005531887 nova_compute[186849]: 2025-11-22 07:41:59.890 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:02 np0005531887 nova_compute[186849]: 2025-11-22 07:42:02.765 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "a2102d70-9444-4326-bc8f-08e05e669d9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:02 np0005531887 nova_compute[186849]: 2025-11-22 07:42:02.765 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:02 np0005531887 nova_compute[186849]: 2025-11-22 07:42:02.790 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:42:02 np0005531887 podman[214514]: 2025-11-22 07:42:02.828676977 +0000 UTC m=+0.050645439 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.314 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.315 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.323 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.323 186853 INFO nova.compute.claims [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.619 186853 DEBUG nova.compute.provider_tree [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.636 186853 DEBUG nova.scheduler.client.report [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.734 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.735 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.851 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 22 02:42:03 np0005531887 nova_compute[186849]: 2025-11-22 07:42:03.970 186853 INFO nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.061 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.393 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.501 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.502 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.503 186853 INFO nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Creating image(s)#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.505 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.505 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.505 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.518 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.606 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.608 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.609 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.620 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.713 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.714 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.772 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.773 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.773 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.861 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.863 186853 DEBUG nova.virt.disk.api [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Checking if we can resize image /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.863 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.892 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.929 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.930 186853 DEBUG nova.virt.disk.api [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Cannot resize image /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:42:04 np0005531887 nova_compute[186849]: 2025-11-22 07:42:04.930 186853 DEBUG nova.objects.instance [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lazy-loading 'migration_context' on Instance uuid a2102d70-9444-4326-bc8f-08e05e669d9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.111 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.112 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Ensure instance console log exists: /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.112 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.113 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.113 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.115 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.120 186853 WARNING nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.127 186853 DEBUG nova.virt.libvirt.host [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.128 186853 DEBUG nova.virt.libvirt.host [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.135 186853 DEBUG nova.virt.libvirt.host [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.136 186853 DEBUG nova.virt.libvirt.host [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.138 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.138 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.138 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.138 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.139 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.139 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.139 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.139 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.140 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.140 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.140 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.140 186853 DEBUG nova.virt.hardware [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.144 186853 DEBUG nova.objects.instance [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lazy-loading 'pci_devices' on Instance uuid a2102d70-9444-4326-bc8f-08e05e669d9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.166 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <uuid>a2102d70-9444-4326-bc8f-08e05e669d9d</uuid>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <name>instance-0000000b</name>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-227776945</nova:name>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:42:05</nova:creationTime>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:user uuid="7c080ee1829a431b913bee21bfdc05a0">tempest-ServerDiagnosticsV248Test-1210129591-project-member</nova:user>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:        <nova:project uuid="1816c7516c7a44fca5ba6c88d4462163">tempest-ServerDiagnosticsV248Test-1210129591</nova:project>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="serial">a2102d70-9444-4326-bc8f-08e05e669d9d</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="uuid">a2102d70-9444-4326-bc8f-08e05e669d9d</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.config"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/console.log" append="off"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:42:05 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:42:05 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:42:05 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:42:05 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.272 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.273 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.274 186853 INFO nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Using config drive#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.578 186853 INFO nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Creating config drive at /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.config#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.583 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplawk0i5j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:42:05 np0005531887 nova_compute[186849]: 2025-11-22 07:42:05.722 186853 DEBUG oslo_concurrency.processutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplawk0i5j" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:42:05 np0005531887 systemd-machined[153180]: New machine qemu-4-instance-0000000b.
Nov 22 02:42:05 np0005531887 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.674 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797326.673275, a2102d70-9444-4326-bc8f-08e05e669d9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.676 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.681 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.683 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.692 186853 INFO nova.virt.libvirt.driver [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Instance spawned successfully.#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.693 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.731 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.741 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.742 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.742 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.743 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.743 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.744 186853 DEBUG nova.virt.libvirt.driver [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.750 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.792 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.792 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797326.6751542, a2102d70-9444-4326-bc8f-08e05e669d9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.792 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.825 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.829 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.832 186853 INFO nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Took 2.33 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.833 186853 DEBUG nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.866 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.946 186853 INFO nova.compute.manager [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Took 3.71 seconds to build instance.#033[00m
Nov 22 02:42:06 np0005531887 nova_compute[186849]: 2025-11-22 07:42:06.963 186853 DEBUG oslo_concurrency.lockutils [None req-649e9bd0-6dec-4d86-91ec-f262e28306ed 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:07 np0005531887 podman[214580]: 2025-11-22 07:42:07.872222471 +0000 UTC m=+0.078033879 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git)
Nov 22 02:42:09 np0005531887 nova_compute[186849]: 2025-11-22 07:42:09.390 186853 DEBUG nova.compute.manager [None req-d814bea2-191d-47fc-8b33-3a1f879d2f58 dab2d8cd125646b795f382c0e6ef7a0d 4f5e98fc3e184950b4eee80ec27caeeb - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:09 np0005531887 nova_compute[186849]: 2025-11-22 07:42:09.394 186853 INFO nova.compute.manager [None req-d814bea2-191d-47fc-8b33-3a1f879d2f58 dab2d8cd125646b795f382c0e6ef7a0d 4f5e98fc3e184950b4eee80ec27caeeb - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Retrieving diagnostics#033[00m
Nov 22 02:42:09 np0005531887 nova_compute[186849]: 2025-11-22 07:42:09.396 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:09 np0005531887 nova_compute[186849]: 2025-11-22 07:42:09.893 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:14 np0005531887 nova_compute[186849]: 2025-11-22 07:42:14.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:14 np0005531887 nova_compute[186849]: 2025-11-22 07:42:14.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:16 np0005531887 podman[214601]: 2025-11-22 07:42:16.884568323 +0000 UTC m=+0.094066974 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 02:42:16 np0005531887 podman[214600]: 2025-11-22 07:42:16.888014396 +0000 UTC m=+0.096233336 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:42:19 np0005531887 nova_compute[186849]: 2025-11-22 07:42:19.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:19 np0005531887 nova_compute[186849]: 2025-11-22 07:42:19.859 186853 DEBUG nova.compute.manager [None req-461679d8-25d4-43e6-9467-21c773a9dbe0 dab2d8cd125646b795f382c0e6ef7a0d 4f5e98fc3e184950b4eee80ec27caeeb - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:19 np0005531887 nova_compute[186849]: 2025-11-22 07:42:19.863 186853 INFO nova.compute.manager [None req-461679d8-25d4-43e6-9467-21c773a9dbe0 dab2d8cd125646b795f382c0e6ef7a0d 4f5e98fc3e184950b4eee80ec27caeeb - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Retrieving diagnostics#033[00m
Nov 22 02:42:19 np0005531887 nova_compute[186849]: 2025-11-22 07:42:19.928 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.547 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "a2102d70-9444-4326-bc8f-08e05e669d9d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.548 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.548 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "a2102d70-9444-4326-bc8f-08e05e669d9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.548 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.549 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.555 186853 INFO nova.compute.manager [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Terminating instance#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.560 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "refresh_cache-a2102d70-9444-4326-bc8f-08e05e669d9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.561 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquired lock "refresh_cache-a2102d70-9444-4326-bc8f-08e05e669d9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:42:20 np0005531887 nova_compute[186849]: 2025-11-22 07:42:20.561 186853 DEBUG nova.network.neutron [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:42:21 np0005531887 nova_compute[186849]: 2025-11-22 07:42:21.025 186853 DEBUG nova.network.neutron [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.488 186853 DEBUG nova.network.neutron [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.519 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Releasing lock "refresh_cache-a2102d70-9444-4326-bc8f-08e05e669d9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.520 186853 DEBUG nova.compute.manager [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:42:22 np0005531887 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 22 02:42:22 np0005531887 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 14.352s CPU time.
Nov 22 02:42:22 np0005531887 systemd-machined[153180]: Machine qemu-4-instance-0000000b terminated.
Nov 22 02:42:22 np0005531887 podman[214661]: 2025-11-22 07:42:22.62421533 +0000 UTC m=+0.056444689 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.763 186853 INFO nova.virt.libvirt.driver [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Instance destroyed successfully.#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.763 186853 DEBUG nova.objects.instance [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lazy-loading 'resources' on Instance uuid a2102d70-9444-4326-bc8f-08e05e669d9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.780 186853 INFO nova.virt.libvirt.driver [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Deleting instance files /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d_del#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.781 186853 INFO nova.virt.libvirt.driver [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Deletion of /var/lib/nova/instances/a2102d70-9444-4326-bc8f-08e05e669d9d_del complete#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.899 186853 INFO nova.compute.manager [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.900 186853 DEBUG oslo.service.loopingcall [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.900 186853 DEBUG nova.compute.manager [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:42:22 np0005531887 nova_compute[186849]: 2025-11-22 07:42:22.900 186853 DEBUG nova.network.neutron [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.348 186853 DEBUG nova.network.neutron [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.366 186853 DEBUG nova.network.neutron [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.385 186853 INFO nova.compute.manager [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Took 0.48 seconds to deallocate network for instance.#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.498 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.498 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.561 186853 DEBUG nova.compute.provider_tree [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.578 186853 DEBUG nova.scheduler.client.report [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.613 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.660 186853 INFO nova.scheduler.client.report [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Deleted allocations for instance a2102d70-9444-4326-bc8f-08e05e669d9d#033[00m
Nov 22 02:42:23 np0005531887 nova_compute[186849]: 2025-11-22 07:42:23.767 186853 DEBUG oslo_concurrency.lockutils [None req-f908a570-f7cf-438a-9ead-3e02fbd3d155 7c080ee1829a431b913bee21bfdc05a0 1816c7516c7a44fca5ba6c88d4462163 - - default default] Lock "a2102d70-9444-4326-bc8f-08e05e669d9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:24 np0005531887 nova_compute[186849]: 2025-11-22 07:42:24.409 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:24 np0005531887 nova_compute[186849]: 2025-11-22 07:42:24.929 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:25 np0005531887 podman[214696]: 2025-11-22 07:42:25.83960653 +0000 UTC m=+0.058590723 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:42:28 np0005531887 podman[214715]: 2025-11-22 07:42:28.839395066 +0000 UTC m=+0.059061814 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:42:29 np0005531887 nova_compute[186849]: 2025-11-22 07:42:29.412 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:29 np0005531887 nova_compute[186849]: 2025-11-22 07:42:29.933 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:32 np0005531887 nova_compute[186849]: 2025-11-22 07:42:32.146 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:32 np0005531887 nova_compute[186849]: 2025-11-22 07:42:32.146 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531887 nova_compute[186849]: 2025-11-22 07:42:33.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:33 np0005531887 podman[214735]: 2025-11-22 07:42:33.848121238 +0000 UTC m=+0.065002989 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:42:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:34.024 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:42:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:34.025 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:42:34 np0005531887 nova_compute[186849]: 2025-11-22 07:42:34.025 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:34 np0005531887 nova_compute[186849]: 2025-11-22 07:42:34.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:34 np0005531887 nova_compute[186849]: 2025-11-22 07:42:34.934 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.806 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.977 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.978 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5810MB free_disk=73.46228408813477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.978 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:35 np0005531887 nova_compute[186849]: 2025-11-22 07:42:35.979 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.056 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.056 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.102 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.118 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.201 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:42:36 np0005531887 nova_compute[186849]: 2025-11-22 07:42:36.202 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.658 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:42:36.660 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:37.315 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:37.315 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:37.315 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:42:37 np0005531887 nova_compute[186849]: 2025-11-22 07:42:37.761 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797342.7598512, a2102d70-9444-4326-bc8f-08e05e669d9d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:42:37 np0005531887 nova_compute[186849]: 2025-11-22 07:42:37.761 186853 INFO nova.compute.manager [-] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:42:37 np0005531887 nova_compute[186849]: 2025-11-22 07:42:37.784 186853 DEBUG nova.compute.manager [None req-1edfffc7-1506-4c24-83d4-cc4baace5371 - - - - - -] [instance: a2102d70-9444-4326-bc8f-08e05e669d9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:42:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:42:38.027 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:42:38 np0005531887 podman[214760]: 2025-11-22 07:42:38.847602481 +0000 UTC m=+0.064060127 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 02:42:39 np0005531887 nova_compute[186849]: 2025-11-22 07:42:39.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:39 np0005531887 nova_compute[186849]: 2025-11-22 07:42:39.937 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:43 np0005531887 ovn_controller[95130]: 2025-11-22T07:42:43Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 02:42:44 np0005531887 nova_compute[186849]: 2025-11-22 07:42:44.420 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:44 np0005531887 nova_compute[186849]: 2025-11-22 07:42:44.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:47 np0005531887 podman[214783]: 2025-11-22 07:42:47.850245801 +0000 UTC m=+0.069250124 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:42:47 np0005531887 podman[214784]: 2025-11-22 07:42:47.860122022 +0000 UTC m=+0.078955981 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:42:49 np0005531887 nova_compute[186849]: 2025-11-22 07:42:49.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:49 np0005531887 nova_compute[186849]: 2025-11-22 07:42:49.939 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:52 np0005531887 podman[214829]: 2025-11-22 07:42:52.829227874 +0000 UTC m=+0.048407885 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:42:54 np0005531887 nova_compute[186849]: 2025-11-22 07:42:54.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:54 np0005531887 nova_compute[186849]: 2025-11-22 07:42:54.941 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:56 np0005531887 podman[214853]: 2025-11-22 07:42:56.842362844 +0000 UTC m=+0.055491687 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:42:59 np0005531887 nova_compute[186849]: 2025-11-22 07:42:59.428 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:42:59 np0005531887 podman[214872]: 2025-11-22 07:42:59.834398443 +0000 UTC m=+0.053979211 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 02:42:59 np0005531887 nova_compute[186849]: 2025-11-22 07:42:59.943 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:04 np0005531887 nova_compute[186849]: 2025-11-22 07:43:04.431 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:04 np0005531887 podman[214894]: 2025-11-22 07:43:04.831116458 +0000 UTC m=+0.049450290 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:43:04 np0005531887 nova_compute[186849]: 2025-11-22 07:43:04.947 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:09 np0005531887 nova_compute[186849]: 2025-11-22 07:43:09.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:09 np0005531887 podman[214918]: 2025-11-22 07:43:09.858632336 +0000 UTC m=+0.064409084 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 22 02:43:09 np0005531887 nova_compute[186849]: 2025-11-22 07:43:09.948 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:14 np0005531887 nova_compute[186849]: 2025-11-22 07:43:14.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:14 np0005531887 nova_compute[186849]: 2025-11-22 07:43:14.950 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:18 np0005531887 podman[214942]: 2025-11-22 07:43:18.842419747 +0000 UTC m=+0.059471185 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:43:18 np0005531887 podman[214943]: 2025-11-22 07:43:18.906294787 +0000 UTC m=+0.117224996 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:43:19 np0005531887 nova_compute[186849]: 2025-11-22 07:43:19.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:19 np0005531887 nova_compute[186849]: 2025-11-22 07:43:19.951 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:23 np0005531887 podman[214989]: 2025-11-22 07:43:23.825691534 +0000 UTC m=+0.049645334 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:43:24 np0005531887 nova_compute[186849]: 2025-11-22 07:43:24.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:24 np0005531887 nova_compute[186849]: 2025-11-22 07:43:24.953 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:27 np0005531887 podman[215013]: 2025-11-22 07:43:27.839318966 +0000 UTC m=+0.055561418 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:43:29 np0005531887 nova_compute[186849]: 2025-11-22 07:43:29.444 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:29 np0005531887 nova_compute[186849]: 2025-11-22 07:43:29.954 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:30 np0005531887 podman[215032]: 2025-11-22 07:43:30.837645958 +0000 UTC m=+0.058782038 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.202 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.608 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.608 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.633 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.722 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.723 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.730 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.731 186853 INFO nova.compute.claims [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.789 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.867 186853 DEBUG nova.compute.provider_tree [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.879 186853 DEBUG nova.scheduler.client.report [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.907 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.908 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.997 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:43:32 np0005531887 nova_compute[186849]: 2025-11-22 07:43:32.997 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.026 186853 INFO nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.049 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.182 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.184 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.184 186853 INFO nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating image(s)#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.185 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.185 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.186 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.198 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.279 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.281 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.282 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.299 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.348 186853 DEBUG nova.policy [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.361 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.362 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.496 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.497 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.497 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.572 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.574 186853 DEBUG nova.virt.disk.api [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Checking if we can resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.574 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.625 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.626 186853 DEBUG nova.virt.disk.api [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Cannot resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.627 186853 DEBUG nova.objects.instance [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'migration_context' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.657 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.657 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Ensure instance console log exists: /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.658 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.658 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.658 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:33 np0005531887 nova_compute[186849]: 2025-11-22 07:43:33.789 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:34 np0005531887 nova_compute[186849]: 2025-11-22 07:43:34.446 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:34 np0005531887 nova_compute[186849]: 2025-11-22 07:43:34.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:34 np0005531887 nova_compute[186849]: 2025-11-22 07:43:34.891 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Successfully created port: 66ab05b0-442e-4420-82b9-0fc90a3df63b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:43:34 np0005531887 nova_compute[186849]: 2025-11-22 07:43:34.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:34.978 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:34.978 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:43:34 np0005531887 nova_compute[186849]: 2025-11-22 07:43:34.979 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:43:35 np0005531887 podman[215067]: 2025-11-22 07:43:35.847325171 +0000 UTC m=+0.063157324 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.988 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.989 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.989 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.989 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:35 np0005531887 nova_compute[186849]: 2025-11-22 07:43:35.990 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.004 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.682 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.711 186853 WARNING nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.712 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Triggering sync for uuid 144e6cca-5b79-4b25-9456-a59f6895075b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.712 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.796 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.985 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.987 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5805MB free_disk=73.46207046508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.987 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:36 np0005531887 nova_compute[186849]: 2025-11-22 07:43:36.988 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.249 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 144e6cca-5b79-4b25-9456-a59f6895075b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.249 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.249 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:37.315 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:37.315 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:37.316 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.448 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.455 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Successfully updated port: 66ab05b0-442e-4420-82b9-0fc90a3df63b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.475 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.480 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.480 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.480 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.505 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.505 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.596 186853 DEBUG nova.compute.manager [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.597 186853 DEBUG nova.compute.manager [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing instance network info cache due to event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.597 186853 DEBUG oslo_concurrency.lockutils [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:37 np0005531887 nova_compute[186849]: 2025-11-22 07:43:37.700 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:43:38 np0005531887 nova_compute[186849]: 2025-11-22 07:43:38.504 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:38 np0005531887 nova_compute[186849]: 2025-11-22 07:43:38.505 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:43:38 np0005531887 nova_compute[186849]: 2025-11-22 07:43:38.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:43:39 np0005531887 nova_compute[186849]: 2025-11-22 07:43:39.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:39 np0005531887 nova_compute[186849]: 2025-11-22 07:43:39.959 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.301 186853 DEBUG nova.network.neutron [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.329 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.330 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance network_info: |[{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.331 186853 DEBUG oslo_concurrency.lockutils [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.331 186853 DEBUG nova.network.neutron [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.336 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Start _get_guest_xml network_info=[{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.342 186853 WARNING nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.349 186853 DEBUG nova.virt.libvirt.host [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.350 186853 DEBUG nova.virt.libvirt.host [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.359 186853 DEBUG nova.virt.libvirt.host [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.360 186853 DEBUG nova.virt.libvirt.host [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.362 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.362 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.363 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.363 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.363 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.363 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.364 186853 DEBUG nova.virt.hardware [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.368 186853 DEBUG nova.virt.libvirt.vif [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-150570
1588-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:33Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.369 186853 DEBUG nova.network.os_vif_util [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.369 186853 DEBUG nova.network.os_vif_util [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.370 186853 DEBUG nova.objects.instance [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'pci_devices' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.383 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <uuid>144e6cca-5b79-4b25-9456-a59f6895075b</uuid>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <name>instance-00000010</name>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1027576693</nova:name>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:43:40</nova:creationTime>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:user uuid="4ca2e31d955040598948fa3da5d84888">tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member</nova:user>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:project uuid="74651b744925468db6c6e47d1397cc04">tempest-LiveAutoBlockMigrationV225Test-1505701588</nova:project>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        <nova:port uuid="66ab05b0-442e-4420-82b9-0fc90a3df63b">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="serial">144e6cca-5b79-4b25-9456-a59f6895075b</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="uuid">144e6cca-5b79-4b25-9456-a59f6895075b</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:4f:30:6c"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <target dev="tap66ab05b0-44"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/console.log" append="off"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:43:40 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:43:40 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:43:40 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:43:40 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.384 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Preparing to wait for external event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.384 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.384 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.384 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.385 186853 DEBUG nova.virt.libvirt.vif [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225T
est-1505701588-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:33Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.385 186853 DEBUG nova.network.os_vif_util [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.385 186853 DEBUG nova.network.os_vif_util [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.386 186853 DEBUG os_vif [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.386 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.387 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.389 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.389 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ab05b0-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.390 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ab05b0-44, col_values=(('external_ids', {'iface-id': '66ab05b0-442e-4420-82b9-0fc90a3df63b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:30:6c', 'vm-uuid': '144e6cca-5b79-4b25-9456-a59f6895075b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.391 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:40 np0005531887 NetworkManager[55210]: <info>  [1763797420.3926] manager: (tap66ab05b0-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.395 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.400 186853 INFO os_vif [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.493 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.494 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.494 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No VIF found with MAC fa:16:3e:4f:30:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.495 186853 INFO nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Using config drive#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.803 186853 INFO nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating config drive at /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config#033[00m
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.810 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1h88r73 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:40 np0005531887 podman[215093]: 2025-11-22 07:43:40.86750031 +0000 UTC m=+0.085936381 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm)
Nov 22 02:43:40 np0005531887 nova_compute[186849]: 2025-11-22 07:43:40.933 186853 DEBUG oslo_concurrency.processutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1h88r73" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:40.981 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:41 np0005531887 kernel: tap66ab05b0-44: entered promiscuous mode
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.0023] manager: (tap66ab05b0-44): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Nov 22 02:43:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:41Z|00037|binding|INFO|Claiming lport 66ab05b0-442e-4420-82b9-0fc90a3df63b for this chassis.
Nov 22 02:43:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:41Z|00038|binding|INFO|66ab05b0-442e-4420-82b9-0fc90a3df63b: Claiming fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.003 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.007 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.014 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.025 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.026 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 bound to our chassis#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.028 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9#033[00m
Nov 22 02:43:41 np0005531887 systemd-udevd[215130]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.040 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[425a981d-8d24-48fb-a3c1-1ce1e75e9025]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.041 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd5fa4f6-01 in ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.043 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd5fa4f6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.043 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6d21f19c-9c8c-4f7c-81eb-46ef41387bc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.043 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f87a39cf-afb0-4bfc-b796-eb097d782b91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 systemd-machined[153180]: New machine qemu-5-instance-00000010.
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.0526] device (tap66ab05b0-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.0537] device (tap66ab05b0-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.055 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[69cb7a7d-4930-45b9-b3e3-ce5a6f964eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 systemd[1]: Started Virtual Machine qemu-5-instance-00000010.
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.083 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[46c1cc91-af72-4799-817c-ada49b9464f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.085 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:41Z|00039|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b ovn-installed in OVS
Nov 22 02:43:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:41Z|00040|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b up in Southbound
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.111 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea52f07-5078-45a9-b962-2bb77752badb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.116 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[38b54be5-fe93-408f-8b45-92c4fe4b2145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.1183] manager: (tapcd5fa4f6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 22 02:43:41 np0005531887 systemd-udevd[215135]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.145 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e686c6a8-7b83-4853-b27a-441269f50f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.148 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad3380f-4a4f-412c-827c-ec2ad8cf3cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.1712] device (tapcd5fa4f6-00): carrier: link connected
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.175 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[07bcec8c-2eb9-4aab-a24f-11f604deb324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.191 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dad9b468-eebc-4ec9-a166-be0d218cf034]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416377, 'reachable_time': 42777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215164, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.206 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5c230928-da4c-4b9d-9fae-6660463b3aff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:db2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416377, 'tstamp': 416377}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215166, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.221 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e73cf913-7bc8-452d-8e20-24b1c79dbd41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416377, 'reachable_time': 42777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215167, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.253 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ef71a215-c017-4b15-ae4c-16753d8a5a54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.313 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[75a759bd-8f1a-4a71-b5a5-5495e492a4ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.314 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.314 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.314 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd5fa4f6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.316 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 NetworkManager[55210]: <info>  [1763797421.3168] manager: (tapcd5fa4f6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 22 02:43:41 np0005531887 kernel: tapcd5fa4f6-00: entered promiscuous mode
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.318 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.319 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd5fa4f6-00, col_values=(('external_ids', {'iface-id': 'f400467f-3f35-4435-bb4a-0b3da05366fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.320 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.321 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:41Z|00041|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.321 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.322 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cadd50f7-08aa-4d66-9889-063c4d58460c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.322 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:41.323 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'env', 'PROCESS_TAG=haproxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.332 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.442 186853 DEBUG nova.compute.manager [req-701c0c01-8000-4937-97ca-31854fedb5d0 req-7615100d-e825-4138-a328-a5610293ee64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.442 186853 DEBUG oslo_concurrency.lockutils [req-701c0c01-8000-4937-97ca-31854fedb5d0 req-7615100d-e825-4138-a328-a5610293ee64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.442 186853 DEBUG oslo_concurrency.lockutils [req-701c0c01-8000-4937-97ca-31854fedb5d0 req-7615100d-e825-4138-a328-a5610293ee64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.443 186853 DEBUG oslo_concurrency.lockutils [req-701c0c01-8000-4937-97ca-31854fedb5d0 req-7615100d-e825-4138-a328-a5610293ee64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.443 186853 DEBUG nova.compute.manager [req-701c0c01-8000-4937-97ca-31854fedb5d0 req-7615100d-e825-4138-a328-a5610293ee64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Processing event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.554 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797421.5542264, 144e6cca-5b79-4b25-9456-a59f6895075b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.555 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Started (Lifecycle Event)#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.557 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.560 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.563 186853 INFO nova.virt.libvirt.driver [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance spawned successfully.#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.564 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.620 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.625 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.626 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.626 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.627 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.627 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.627 186853 DEBUG nova.virt.libvirt.driver [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.631 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.671 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.671 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797421.5545962, 144e6cca-5b79-4b25-9456-a59f6895075b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.671 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.690 186853 DEBUG nova.network.neutron [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updated VIF entry in instance network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.691 186853 DEBUG nova.network.neutron [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.693 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.697 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797421.5606146, 144e6cca-5b79-4b25-9456-a59f6895075b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.698 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:43:41 np0005531887 podman[215206]: 2025-11-22 07:43:41.706293318 +0000 UTC m=+0.050733711 container create 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.707 186853 DEBUG oslo_concurrency.lockutils [req-bfef16d5-10f0-403d-808d-6ccfc4677f06 req-8ed1916e-63cf-4997-84f8-d73e8b6b8101 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.720 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.722 186853 INFO nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 8.54 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.722 186853 DEBUG nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.725 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:41 np0005531887 systemd[1]: Started libpod-conmon-1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983.scope.
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.753 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:41 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:43:41 np0005531887 podman[215206]: 2025-11-22 07:43:41.676547891 +0000 UTC m=+0.020988304 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:43:41 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/622139ff980240a5e975f4a253bda1d7a8572f840d01ef3fe2e6eb962e954b68/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:43:41 np0005531887 podman[215206]: 2025-11-22 07:43:41.789658046 +0000 UTC m=+0.134098469 container init 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:43:41 np0005531887 podman[215206]: 2025-11-22 07:43:41.795770245 +0000 UTC m=+0.140210638 container start 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.811 186853 INFO nova.compute.manager [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 9.12 seconds to build instance.#033[00m
Nov 22 02:43:41 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [NOTICE]   (215225) : New worker (215227) forked
Nov 22 02:43:41 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [NOTICE]   (215225) : Loading success.
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.831 186853 DEBUG oslo_concurrency.lockutils [None req-05c4ea17-c788-472a-8f07-2608c1959ecc 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.831 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.832 186853 INFO nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:41 np0005531887 nova_compute[186849]: 2025-11-22 07:43:41.832 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.294 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.295 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.325 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.436 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.436 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.444 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.444 186853 INFO nova.compute.claims [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.539 186853 DEBUG nova.compute.manager [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.539 186853 DEBUG oslo_concurrency.lockutils [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.539 186853 DEBUG oslo_concurrency.lockutils [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.540 186853 DEBUG oslo_concurrency.lockutils [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.540 186853 DEBUG nova.compute.manager [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.540 186853 WARNING nova.compute.manager [req-842ad613-5dc8-458a-a214-92ad521560e2 req-2b139c9f-16d9-4139-b403-2c0b8c85fe9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state None.
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.597 186853 DEBUG nova.compute.provider_tree [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.608 186853 DEBUG nova.scheduler.client.report [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.630 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.631 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.697 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.698 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.716 186853 INFO nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.744 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.833 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.834 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.835 186853 INFO nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Creating image(s)
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.837 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.837 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.838 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.856 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.885 186853 DEBUG nova.policy [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.911 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.912 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.912 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.923 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.980 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:43:43 np0005531887 nova_compute[186849]: 2025-11-22 07:43:43.981 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.020 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.021 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.021 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.083 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.084 186853 DEBUG nova.virt.disk.api [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.085 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.139 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.140 186853 DEBUG nova.virt.disk.api [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.140 186853 DEBUG nova.objects.instance [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 9141d9a1-7a40-4a72-a3f7-2d67ae112383 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.170 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.171 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Ensure instance console log exists: /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.171 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.171 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.172 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:43:44 np0005531887 nova_compute[186849]: 2025-11-22 07:43:44.815 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Successfully created port: 606a4645-8996-452d-9864-00ce49d9140c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 02:43:45 np0005531887 nova_compute[186849]: 2025-11-22 07:43:45.096 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:43:45 np0005531887 nova_compute[186849]: 2025-11-22 07:43:45.391 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:43:46 np0005531887 nova_compute[186849]: 2025-11-22 07:43:46.883 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Successfully updated port: 606a4645-8996-452d-9864-00ce49d9140c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 02:43:46 np0005531887 nova_compute[186849]: 2025-11-22 07:43:46.899 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:43:46 np0005531887 nova_compute[186849]: 2025-11-22 07:43:46.899 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquired lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:43:46 np0005531887 nova_compute[186849]: 2025-11-22 07:43:46.899 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.007 186853 DEBUG nova.compute.manager [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-changed-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.008 186853 DEBUG nova.compute.manager [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Refreshing instance network info cache due to event network-changed-606a4645-8996-452d-9864-00ce49d9140c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.008 186853 DEBUG oslo_concurrency.lockutils [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.105 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.210 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Check if temp file /var/lib/nova/instances/tmpjjc7nk5w exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.211 186853 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.871 186853 DEBUG nova.network.neutron [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Updating instance_info_cache with network_info: [{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.892 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Releasing lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.892 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Instance network_info: |[{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.893 186853 DEBUG oslo_concurrency.lockutils [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.893 186853 DEBUG nova.network.neutron [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Refreshing network info cache for port 606a4645-8996-452d-9864-00ce49d9140c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.897 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Start _get_guest_xml network_info=[{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.901 186853 WARNING nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.907 186853 DEBUG nova.virt.libvirt.host [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.908 186853 DEBUG nova.virt.libvirt.host [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.912 186853 DEBUG nova.virt.libvirt.host [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.913 186853 DEBUG nova.virt.libvirt.host [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.914 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.914 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.915 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.915 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.916 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.916 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.916 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.916 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.917 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.917 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.917 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.917 186853 DEBUG nova.virt.hardware [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.922 186853 DEBUG nova.virt.libvirt.vif [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-611382116',display_name='tempest-ServersAdminTestJSON-server-611382116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-611382116',id=18,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-e3r0caf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-18431
19868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:43Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=9141d9a1-7a40-4a72-a3f7-2d67ae112383,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.923 186853 DEBUG nova.network.os_vif_util [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.924 186853 DEBUG nova.network.os_vif_util [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.925 186853 DEBUG nova.objects.instance [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9141d9a1-7a40-4a72-a3f7-2d67ae112383 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.944 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <uuid>9141d9a1-7a40-4a72-a3f7-2d67ae112383</uuid>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <name>instance-00000012</name>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersAdminTestJSON-server-611382116</nova:name>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:43:47</nova:creationTime>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        <nova:port uuid="606a4645-8996-452d-9864-00ce49d9140c">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="serial">9141d9a1-7a40-4a72-a3f7-2d67ae112383</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="uuid">9141d9a1-7a40-4a72-a3f7-2d67ae112383</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.config"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ce:a6:f6"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <target dev="tap606a4645-89"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/console.log" append="off"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:43:47 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:43:47 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:43:47 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:43:47 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.946 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Preparing to wait for external event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.947 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.947 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.947 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.948 186853 DEBUG nova.virt.libvirt.vif [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-611382116',display_name='tempest-ServersAdminTestJSON-server-611382116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-611382116',id=18,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-e3r0caf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTest
JSON-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:43Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=9141d9a1-7a40-4a72-a3f7-2d67ae112383,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.948 186853 DEBUG nova.network.os_vif_util [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.949 186853 DEBUG nova.network.os_vif_util [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.950 186853 DEBUG os_vif [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.950 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.951 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.951 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.956 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap606a4645-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.956 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap606a4645-89, col_values=(('external_ids', {'iface-id': '606a4645-8996-452d-9864-00ce49d9140c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:a6:f6', 'vm-uuid': '9141d9a1-7a40-4a72-a3f7-2d67ae112383'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:47 np0005531887 NetworkManager[55210]: <info>  [1763797427.9591] manager: (tap606a4645-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.961 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.965 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:47 np0005531887 nova_compute[186849]: 2025-11-22 07:43:47.966 186853 INFO os_vif [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89')#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.016 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.016 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.017 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:ce:a6:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.017 186853 INFO nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Using config drive#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.491 186853 INFO nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Creating config drive at /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.config#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.496 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkygmmyng execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.620 186853 DEBUG oslo_concurrency.processutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkygmmyng" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:48 np0005531887 kernel: tap606a4645-89: entered promiscuous mode
Nov 22 02:43:48 np0005531887 NetworkManager[55210]: <info>  [1763797428.6833] manager: (tap606a4645-89): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.685 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:48 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:48Z|00042|binding|INFO|Claiming lport 606a4645-8996-452d-9864-00ce49d9140c for this chassis.
Nov 22 02:43:48 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:48Z|00043|binding|INFO|606a4645-8996-452d-9864-00ce49d9140c: Claiming fa:16:3e:ce:a6:f6 10.100.0.6
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.691 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.726 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a6:f6 10.100.0.6'], port_security=['fa:16:3e:ce:a6:f6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9141d9a1-7a40-4a72-a3f7-2d67ae112383', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=606a4645-8996-452d-9864-00ce49d9140c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.728 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 606a4645-8996-452d-9864-00ce49d9140c in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.730 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723#033[00m
Nov 22 02:43:48 np0005531887 systemd-machined[153180]: New machine qemu-6-instance-00000012.
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.742 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9c49c912-a9e7-423b-94b1-e5b8320bba2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.743 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7ba1c27-61 in ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.745 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7ba1c27-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.746 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd273d7-046e-486e-9955-c448c6ddfc38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.748 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6a093c3b-16cb-4ccd-9648-d79a5e3e7211]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:48Z|00044|binding|INFO|Setting lport 606a4645-8996-452d-9864-00ce49d9140c ovn-installed in OVS
Nov 22 02:43:48 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:48Z|00045|binding|INFO|Setting lport 606a4645-8996-452d-9864-00ce49d9140c up in Southbound
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.755 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:48 np0005531887 systemd[1]: Started Virtual Machine qemu-6-instance-00000012.
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.763 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[8541ece9-4152-4299-a63d-268cb6aaa410]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 systemd-udevd[215275]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.786 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0197c81b-ff97-49f1-87ed-b4f10450cf88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 NetworkManager[55210]: <info>  [1763797428.8035] device (tap606a4645-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:43:48 np0005531887 NetworkManager[55210]: <info>  [1763797428.8058] device (tap606a4645-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.820 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[746115fc-b5b6-447d-ac99-2d24385eb8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.824 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e03d8925-304a-4f98-a6f6-85c6888aae17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 NetworkManager[55210]: <info>  [1763797428.8284] manager: (tapd7ba1c27-60): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.830 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.875 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e3acbb31-735d-4004-8e65-72cca3793399]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.885 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf11f92-3161-4fe9-a70e-312f28d8906a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 NetworkManager[55210]: <info>  [1763797428.9119] device (tapd7ba1c27-60): carrier: link connected
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.912 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.914 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.918 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d723c96a-97f4-407c-9c72-a6b27527177a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.937 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[46aaf044-1f61-4cd4-a737-3bfd899662a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417151, 'reachable_time': 40537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215308, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.958 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7c7d4a-e855-4b29-b7c1-6a6633ad2a35]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:37eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417151, 'tstamp': 417151}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215309, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:48.975 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[55411632-8bdc-4239-817d-7c291ffa3b4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417151, 'reachable_time': 40537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215310, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.982 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.983 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.984 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.992 186853 INFO nova.compute.rpcapi [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 22 02:43:48 np0005531887 nova_compute[186849]: 2025-11-22 07:43:48.992 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.006 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f7da44d1-ec29-4e0f-a4e8-8ff67dd94317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.064 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc83533-0cbe-4a2c-8143-6ab79ecb5296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.066 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.066 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.066 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:49 np0005531887 kernel: tapd7ba1c27-60: entered promiscuous mode
Nov 22 02:43:49 np0005531887 NetworkManager[55210]: <info>  [1763797429.0698] manager: (tapd7ba1c27-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.072 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.069 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:49 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:49Z|00046|binding|INFO|Releasing lport 3c20001c-28e2-4cdd-9a7c-497ed470b31c from this chassis (sb_readonly=0)
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.078 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.079 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd7a69e-5438-4337-8fba-9dc6fe8ded10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.080 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:43:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:49.082 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'env', 'PROCESS_TAG=haproxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7ba1c27-6255-4c71-8e98-23a1c59b5723.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.088 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.110 186853 DEBUG nova.compute.manager [req-be349fdc-123c-4e6a-92c1-38ecaa2c4278 req-37c4de34-7ed6-42b3-ba5d-eda2f0c57b43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.110 186853 DEBUG oslo_concurrency.lockutils [req-be349fdc-123c-4e6a-92c1-38ecaa2c4278 req-37c4de34-7ed6-42b3-ba5d-eda2f0c57b43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.111 186853 DEBUG oslo_concurrency.lockutils [req-be349fdc-123c-4e6a-92c1-38ecaa2c4278 req-37c4de34-7ed6-42b3-ba5d-eda2f0c57b43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.111 186853 DEBUG oslo_concurrency.lockutils [req-be349fdc-123c-4e6a-92c1-38ecaa2c4278 req-37c4de34-7ed6-42b3-ba5d-eda2f0c57b43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.111 186853 DEBUG nova.compute.manager [req-be349fdc-123c-4e6a-92c1-38ecaa2c4278 req-37c4de34-7ed6-42b3-ba5d-eda2f0c57b43 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Processing event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.361 186853 DEBUG nova.network.neutron [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Updated VIF entry in instance network info cache for port 606a4645-8996-452d-9864-00ce49d9140c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.362 186853 DEBUG nova.network.neutron [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Updating instance_info_cache with network_info: [{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:49 np0005531887 nova_compute[186849]: 2025-11-22 07:43:49.377 186853 DEBUG oslo_concurrency.lockutils [req-5f170c31-745a-499b-8a11-e5ef64e69296 req-437eaddc-60b5-4a36-95f7-c66a641c59ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:49 np0005531887 podman[215344]: 2025-11-22 07:43:49.497126006 +0000 UTC m=+0.075177579 container create c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:43:49 np0005531887 podman[215344]: 2025-11-22 07:43:49.446664872 +0000 UTC m=+0.024716455 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:43:49 np0005531887 systemd[1]: Started libpod-conmon-c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799.scope.
Nov 22 02:43:49 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:43:49 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e6c8a3c630d48c9c21bb5e3cbc5930f23c1d567c11262afa9edbfcd59b211d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:43:49 np0005531887 podman[215344]: 2025-11-22 07:43:49.598941873 +0000 UTC m=+0.176993446 container init c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 02:43:49 np0005531887 podman[215357]: 2025-11-22 07:43:49.603844083 +0000 UTC m=+0.069666213 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:43:49 np0005531887 podman[215344]: 2025-11-22 07:43:49.605439872 +0000 UTC m=+0.183491425 container start c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:43:49 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [NOTICE]   (215400) : New worker (215406) forked
Nov 22 02:43:49 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [NOTICE]   (215400) : Loading success.
Nov 22 02:43:49 np0005531887 podman[215358]: 2025-11-22 07:43:49.642240241 +0000 UTC m=+0.103809128 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.098 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.413 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797430.412521, 9141d9a1-7a40-4a72-a3f7-2d67ae112383 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.413 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] VM Started (Lifecycle Event)#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.415 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.418 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.421 186853 INFO nova.virt.libvirt.driver [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Instance spawned successfully.#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.422 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.438 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.450 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.455 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.455 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.456 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.456 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.456 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.457 186853 DEBUG nova.virt.libvirt.driver [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.481 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.482 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797430.4127662, 9141d9a1-7a40-4a72-a3f7-2d67ae112383 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.482 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.499 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.502 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797430.4182127, 9141d9a1-7a40-4a72-a3f7-2d67ae112383 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.502 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.517 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.520 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.543 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.570 186853 INFO nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Took 6.74 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.570 186853 DEBUG nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.648 186853 INFO nova.compute.manager [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Took 7.25 seconds to build instance.#033[00m
Nov 22 02:43:50 np0005531887 nova_compute[186849]: 2025-11-22 07:43:50.809 186853 DEBUG oslo_concurrency.lockutils [None req-7619b0fc-3d6d-4a1e-9ca6-1f5c484adb88 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.210 186853 DEBUG nova.compute.manager [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.210 186853 DEBUG oslo_concurrency.lockutils [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.211 186853 DEBUG oslo_concurrency.lockutils [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.211 186853 DEBUG oslo_concurrency.lockutils [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.211 186853 DEBUG nova.compute.manager [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] No waiting events found dispatching network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:51 np0005531887 nova_compute[186849]: 2025-11-22 07:43:51.212 186853 WARNING nova.compute.manager [req-c7b911e5-a048-4d44-aead-aa58aa1809e8 req-70a2e02a-10c6-44c9-bf79-ff50cd0a4af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received unexpected event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c for instance with vm_state active and task_state None.#033[00m
Nov 22 02:43:52 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:43:52 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:43:52 np0005531887 systemd-logind[821]: New session 27 of user nova.
Nov 22 02:43:52 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:43:52 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:43:52 np0005531887 systemd[215430]: Queued start job for default target Main User Target.
Nov 22 02:43:52 np0005531887 systemd[215430]: Created slice User Application Slice.
Nov 22 02:43:52 np0005531887 systemd[215430]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:43:52 np0005531887 systemd[215430]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:43:52 np0005531887 systemd[215430]: Reached target Paths.
Nov 22 02:43:52 np0005531887 systemd[215430]: Reached target Timers.
Nov 22 02:43:52 np0005531887 systemd[215430]: Starting D-Bus User Message Bus Socket...
Nov 22 02:43:52 np0005531887 systemd[215430]: Starting Create User's Volatile Files and Directories...
Nov 22 02:43:52 np0005531887 systemd[215430]: Finished Create User's Volatile Files and Directories.
Nov 22 02:43:52 np0005531887 systemd[215430]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:43:52 np0005531887 systemd[215430]: Reached target Sockets.
Nov 22 02:43:52 np0005531887 systemd[215430]: Reached target Basic System.
Nov 22 02:43:52 np0005531887 systemd[215430]: Reached target Main User Target.
Nov 22 02:43:52 np0005531887 systemd[215430]: Startup finished in 162ms.
Nov 22 02:43:52 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:43:52 np0005531887 systemd[1]: Started Session 27 of User nova.
Nov 22 02:43:52 np0005531887 nova_compute[186849]: 2025-11-22 07:43:52.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:52 np0005531887 systemd[1]: session-27.scope: Deactivated successfully.
Nov 22 02:43:52 np0005531887 systemd-logind[821]: Session 27 logged out. Waiting for processes to exit.
Nov 22 02:43:53 np0005531887 systemd-logind[821]: Removed session 27.
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.530 186853 DEBUG oslo_concurrency.lockutils [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] Acquiring lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.531 186853 DEBUG oslo_concurrency.lockutils [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] Acquired lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.531 186853 DEBUG nova.network.neutron [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.868 186853 DEBUG nova.compute.manager [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.869 186853 DEBUG oslo_concurrency.lockutils [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.869 186853 DEBUG oslo_concurrency.lockutils [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.869 186853 DEBUG oslo_concurrency.lockutils [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.869 186853 DEBUG nova.compute.manager [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:53 np0005531887 nova_compute[186849]: 2025-11-22 07:43:53.869 186853 DEBUG nova.compute.manager [req-a8b9c607-cfe4-437e-ad9b-101305036cea req-dec60489-e06f-4f7f-9271-b7b7339d875c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.812 186853 INFO nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 5.83 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.813 186853 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.832 186853 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d7d1dd24-6605-44cf-9fd2-5b9abad61c6d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.855 186853 DEBUG nova.objects.instance [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:43:54 np0005531887 podman[215467]: 2025-11-22 07:43:54.855977021 +0000 UTC m=+0.071380105 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.858 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.860 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.860 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.877 186853 DEBUG nova.virt.libvirt.vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:43:41Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.878 186853 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.879 186853 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.880 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 02:43:54 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:4f:30:6c"/>
Nov 22 02:43:54 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 02:43:54 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:43:54 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 02:43:54 np0005531887 nova_compute[186849]:  <target dev="tap66ab05b0-44"/>
Nov 22 02:43:54 np0005531887 nova_compute[186849]: </interface>
Nov 22 02:43:54 np0005531887 nova_compute[186849]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.881 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.902 186853 DEBUG nova.network.neutron [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Updating instance_info_cache with network_info: [{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.929 186853 DEBUG oslo_concurrency.lockutils [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] Releasing lock "refresh_cache-9141d9a1-7a40-4a72-a3f7-2d67ae112383" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.930 186853 DEBUG nova.compute.manager [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Nov 22 02:43:54 np0005531887 nova_compute[186849]: 2025-11-22 07:43:54.930 186853 DEBUG nova.compute.manager [None req-47673235-0a02-4160-98e3-931b8c277157 f4dfe5a0ca3d4f2090adec077ca6b9af 765c19fc62c848b9a5257d45514e034f - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] network_info to inject: |[{"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.101 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.363 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.364 186853 INFO nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.451 186853 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.953 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.954 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.974 186853 DEBUG nova.compute.manager [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.974 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.975 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.975 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.975 186853 DEBUG nova.compute.manager [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.975 186853 WARNING nova.compute.manager [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.976 186853 DEBUG nova.compute.manager [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.976 186853 DEBUG nova.compute.manager [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing instance network info cache due to event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.976 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.976 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:43:55 np0005531887 nova_compute[186849]: 2025-11-22 07:43:55.977 186853 DEBUG nova.network.neutron [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:43:56 np0005531887 nova_compute[186849]: 2025-11-22 07:43:56.488 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:43:56 np0005531887 nova_compute[186849]: 2025-11-22 07:43:56.489 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:43:56 np0005531887 nova_compute[186849]: 2025-11-22 07:43:56.993 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:43:56 np0005531887 nova_compute[186849]: 2025-11-22 07:43:56.995 186853 DEBUG nova.virt.libvirt.migration [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:43:57 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:57Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 02:43:57 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:57Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.350 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797437.350135, 144e6cca-5b79-4b25-9456-a59f6895075b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.351 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.372 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.377 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.398 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 22 02:43:57 np0005531887 kernel: tap66ab05b0-44 (unregistering): left promiscuous mode
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:57Z|00047|binding|INFO|Releasing lport 66ab05b0-442e-4420-82b9-0fc90a3df63b from this chassis (sb_readonly=0)
Nov 22 02:43:57 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:57Z|00048|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b down in Southbound
Nov 22 02:43:57 np0005531887 ovn_controller[95130]: 2025-11-22T07:43:57Z|00049|binding|INFO|Removing iface tap66ab05b0-44 ovn-installed in OVS
Nov 22 02:43:57 np0005531887 NetworkManager[55210]: <info>  [1763797437.5318] device (tap66ab05b0-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.540 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.596 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'df09844c-c111-44b4-9c36-d4950a55a590'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '8', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:43:57 np0005531887 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.597 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis#033[00m
Nov 22 02:43:57 np0005531887 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000010.scope: Consumed 15.022s CPU time.
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.599 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.601 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd33b07-4112-4813-8d93-8fb0659670f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.602 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace which is not needed anymore#033[00m
Nov 22 02:43:57 np0005531887 systemd-machined[153180]: Machine qemu-5-instance-00000010 terminated.
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.721 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [NOTICE]   (215225) : haproxy version is 2.8.14-c23fe91
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [NOTICE]   (215225) : path to executable is /usr/sbin/haproxy
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [WARNING]  (215225) : Exiting Master process...
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [WARNING]  (215225) : Exiting Master process...
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [ALERT]    (215225) : Current worker (215227) exited with code 143 (Terminated)
Nov 22 02:43:57 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215221]: [WARNING]  (215225) : All workers exited. Exiting... (0)
Nov 22 02:43:57 np0005531887 systemd[1]: libpod-1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983.scope: Deactivated successfully.
Nov 22 02:43:57 np0005531887 podman[215513]: 2025-11-22 07:43:57.756045991 +0000 UTC m=+0.052455432 container died 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.762 186853 DEBUG nova.virt.libvirt.guest [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.763 186853 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation has completed#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.763 186853 INFO nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] _post_live_migration() is started..#033[00m
Nov 22 02:43:57 np0005531887 virtqemud[186424]: Cannot recv data: Input/output error
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.772 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.772 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.772 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 22 02:43:57 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983-userdata-shm.mount: Deactivated successfully.
Nov 22 02:43:57 np0005531887 systemd[1]: var-lib-containers-storage-overlay-622139ff980240a5e975f4a253bda1d7a8572f840d01ef3fe2e6eb962e954b68-merged.mount: Deactivated successfully.
Nov 22 02:43:57 np0005531887 podman[215513]: 2025-11-22 07:43:57.798865898 +0000 UTC m=+0.095275339 container cleanup 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:43:57 np0005531887 systemd[1]: libpod-conmon-1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983.scope: Deactivated successfully.
Nov 22 02:43:57 np0005531887 podman[215553]: 2025-11-22 07:43:57.890340843 +0000 UTC m=+0.065485132 container remove 1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.895 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2918998e-28af-4c05-b8ae-5a7c4adf24e1]: (4, ('Sat Nov 22 07:43:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983)\n1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983\nSat Nov 22 07:43:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983)\n1878772e49265ceeffc6badec0f62fbf8fad98c28f18f25fa6245b606db4a983\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.897 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[99464dd6-5cae-4fce-bb77-4c68bafc2473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.899 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 kernel: tapcd5fa4f6-00: left promiscuous mode
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.947 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.949 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8b33beb0-2e00-4db1-b9ad-e53b79e0d4f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 nova_compute[186849]: 2025-11-22 07:43:57.963 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.968 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c9227362-8df0-4ea6-bbd7-c5a868253eb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.970 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9105356c-7887-4b83-a4ca-64c3e800560d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.983 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b944dd4f-e5da-4bc8-90df-5547a92c47aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416371, 'reachable_time': 36295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215578, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.985 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:43:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:43:57.986 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a38d4033-6b88-46cb-b1e2-8afdc5cd8f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:43:57 np0005531887 systemd[1]: run-netns-ovnmeta\x2dcd5fa4f6\x2d0f1b\x2d41f2\x2d9643\x2d3c1a36620dc9.mount: Deactivated successfully.
Nov 22 02:43:58 np0005531887 podman[215567]: 2025-11-22 07:43:58.038420502 +0000 UTC m=+0.068669939 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.416 186853 DEBUG nova.compute.manager [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.417 186853 DEBUG oslo_concurrency.lockutils [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.417 186853 DEBUG oslo_concurrency.lockutils [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.417 186853 DEBUG oslo_concurrency.lockutils [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.417 186853 DEBUG nova.compute.manager [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.417 186853 DEBUG nova.compute.manager [req-3ca176ff-f6ab-46d5-a08e-3475d13486a5 req-eefe93d5-a1f9-49b0-9d41-a715203b1dd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.461 186853 DEBUG nova.network.neutron [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updated VIF entry in instance network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.461 186853 DEBUG nova.network.neutron [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.478 186853 DEBUG oslo_concurrency.lockutils [req-5a2d5b9b-8156-4ec8-b9af-11ac4ab67a2a req-602ceee0-4f56-4c7a-b4ac-851ab9b834ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.964 186853 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Activated binding for port 66ab05b0-442e-4420-82b9-0fc90a3df63b and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.965 186853 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.966 186853 DEBUG nova.virt.libvirt.vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:43:46Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.966 186853 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.966 186853 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.967 186853 DEBUG os_vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.968 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:58 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.968 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66ab05b0-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.979 186853 DEBUG nova.compute.manager [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.979 186853 DEBUG oslo_concurrency.lockutils [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.979 186853 DEBUG oslo_concurrency.lockutils [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.980 186853 DEBUG oslo_concurrency.lockutils [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.980 186853 DEBUG nova.compute.manager [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:58.980 186853 DEBUG nova.compute.manager [req-5b8f401b-5a64-44b4-a3cd-513175e44257 req-3b15ee77-a4b8-4039-80ab-0baf14167140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.007 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.012 186853 INFO os_vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.012 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.012 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.013 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.013 186853 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.013 186853 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deleting instance files /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del#033[00m
Nov 22 02:43:59 np0005531887 nova_compute[186849]: 2025-11-22 07:43:59.014 186853 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deletion of /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del complete#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.144 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.499 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.499 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.499 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.499 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.499 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 WARNING nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.500 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 WARNING nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 WARNING nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.501 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.502 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.502 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.502 186853 DEBUG oslo_concurrency.lockutils [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.502 186853 DEBUG nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:00 np0005531887 nova_compute[186849]: 2025-11-22 07:44:00.502 186853 WARNING nova.compute.manager [req-6a7cba29-efc4-4f7d-95c5-31d6203d5179 req-20aeb536-e6f0-4db8-9f66-9f94a659021c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:44:01 np0005531887 podman[215590]: 2025-11-22 07:44:01.860910263 +0000 UTC m=+0.082348063 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 02:44:03 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:44:03 np0005531887 systemd[215430]: Activating special unit Exit the Session...
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped target Main User Target.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped target Basic System.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped target Paths.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped target Sockets.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped target Timers.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:44:03 np0005531887 systemd[215430]: Closed D-Bus User Message Bus Socket.
Nov 22 02:44:03 np0005531887 systemd[215430]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:44:03 np0005531887 systemd[215430]: Removed slice User Application Slice.
Nov 22 02:44:03 np0005531887 systemd[215430]: Reached target Shutdown.
Nov 22 02:44:03 np0005531887 systemd[215430]: Finished Exit the Session.
Nov 22 02:44:03 np0005531887 systemd[215430]: Reached target Exit the Session.
Nov 22 02:44:03 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:44:03 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:44:03 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:44:03 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:44:03 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:44:03 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:44:03 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.703 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.704 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.704 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.729 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.729 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.729 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.730 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.795 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.853 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.855 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:03 np0005531887 nova_compute[186849]: 2025-11-22 07:44:03.911 186853 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.007 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.078 186853 WARNING nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.079 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5555MB free_disk=73.43130111694336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.080 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.080 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.134 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration for instance 144e6cca-5b79-4b25-9456-a59f6895075b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.173 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.209 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Instance 9141d9a1-7a40-4a72-a3f7-2d67ae112383 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.210 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration d7d1dd24-6605-44cf-9fd2-5b9abad61c6d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.210 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.210 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.302 186853 DEBUG nova.compute.provider_tree [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.316 186853 DEBUG nova.scheduler.client.report [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.335 186853 DEBUG nova.compute.resource_tracker [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.336 186853 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.346 186853 INFO nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.453 186853 INFO nova.scheduler.client.report [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Deleted allocation for migration d7d1dd24-6605-44cf-9fd2-5b9abad61c6d#033[00m
Nov 22 02:44:04 np0005531887 nova_compute[186849]: 2025-11-22 07:44:04.453 186853 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 22 02:44:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:04Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:a6:f6 10.100.0.6
Nov 22 02:44:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:04Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:a6:f6 10.100.0.6
Nov 22 02:44:05 np0005531887 nova_compute[186849]: 2025-11-22 07:44:05.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:05 np0005531887 nova_compute[186849]: 2025-11-22 07:44:05.281 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating tmpfile /var/lib/nova/instances/tmppimoasud to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 22 02:44:05 np0005531887 nova_compute[186849]: 2025-11-22 07:44:05.451 186853 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 22 02:44:06 np0005531887 nova_compute[186849]: 2025-11-22 07:44:06.600 186853 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 22 02:44:06 np0005531887 nova_compute[186849]: 2025-11-22 07:44:06.645 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:06 np0005531887 nova_compute[186849]: 2025-11-22 07:44:06.645 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:06 np0005531887 nova_compute[186849]: 2025-11-22 07:44:06.645 186853 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:06 np0005531887 podman[215637]: 2025-11-22 07:44:06.836322898 +0000 UTC m=+0.059103615 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:44:09 np0005531887 nova_compute[186849]: 2025-11-22 07:44:09.010 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.149 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.770 186853 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.786 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.794 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.795 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating instance directory: /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.795 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating disk.info with the contents: {'/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk': 'qcow2', '/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.796 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.797 186853 DEBUG nova.objects.instance [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.818 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.901 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.903 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.904 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.914 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.974 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:10 np0005531887 nova_compute[186849]: 2025-11-22 07:44:10.975 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.008 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.009 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.009 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.070 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.071 186853 DEBUG nova.virt.disk.api [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Checking if we can resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.072 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.129 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.131 186853 DEBUG nova.virt.disk.api [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Cannot resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.131 186853 DEBUG nova.objects.instance [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.152 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.175 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.177 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config to /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.177 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.404 186853 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.405 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.406 186853 DEBUG nova.virt.libvirt.vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:03Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.407 186853 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.408 186853 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.408 186853 DEBUG os_vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.409 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.409 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.410 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.413 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ab05b0-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.414 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ab05b0-44, col_values=(('external_ids', {'iface-id': '66ab05b0-442e-4420-82b9-0fc90a3df63b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:30:6c', 'vm-uuid': '144e6cca-5b79-4b25-9456-a59f6895075b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:11 np0005531887 NetworkManager[55210]: <info>  [1763797451.4655] manager: (tap66ab05b0-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.467 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.474 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.475 186853 INFO os_vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.476 186853 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 22 02:44:11 np0005531887 nova_compute[186849]: 2025-11-22 07:44:11.476 186853 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 22 02:44:11 np0005531887 podman[215685]: 2025-11-22 07:44:11.843392409 +0000 UTC m=+0.058644975 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350)
Nov 22 02:44:12 np0005531887 nova_compute[186849]: 2025-11-22 07:44:12.762 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797437.7616022, 144e6cca-5b79-4b25-9456-a59f6895075b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:12 np0005531887 nova_compute[186849]: 2025-11-22 07:44:12.763 186853 INFO nova.compute.manager [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:44:12 np0005531887 nova_compute[186849]: 2025-11-22 07:44:12.778 186853 DEBUG nova.compute.manager [None req-a688cb3a-09b0-4a12-ba5a-26e4e05d3785 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:14 np0005531887 nova_compute[186849]: 2025-11-22 07:44:14.258 186853 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 22 02:44:14 np0005531887 nova_compute[186849]: 2025-11-22 07:44:14.266 186853 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 22 02:44:14 np0005531887 systemd[1]: Starting libvirt proxy daemon...
Nov 22 02:44:14 np0005531887 systemd[1]: Started libvirt proxy daemon.
Nov 22 02:44:14 np0005531887 kernel: tap66ab05b0-44: entered promiscuous mode
Nov 22 02:44:14 np0005531887 NetworkManager[55210]: <info>  [1763797454.5471] manager: (tap66ab05b0-44): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 22 02:44:14 np0005531887 nova_compute[186849]: 2025-11-22 07:44:14.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:14 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:14Z|00050|binding|INFO|Claiming lport 66ab05b0-442e-4420-82b9-0fc90a3df63b for this additional chassis.
Nov 22 02:44:14 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:14Z|00051|binding|INFO|66ab05b0-442e-4420-82b9-0fc90a3df63b: Claiming fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 02:44:14 np0005531887 nova_compute[186849]: 2025-11-22 07:44:14.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:14 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:14Z|00052|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b ovn-installed in OVS
Nov 22 02:44:14 np0005531887 nova_compute[186849]: 2025-11-22 07:44:14.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:14 np0005531887 systemd-udevd[215741]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:44:14 np0005531887 systemd-machined[153180]: New machine qemu-7-instance-00000010.
Nov 22 02:44:14 np0005531887 NetworkManager[55210]: <info>  [1763797454.5963] device (tap66ab05b0-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:44:14 np0005531887 NetworkManager[55210]: <info>  [1763797454.5972] device (tap66ab05b0-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:44:14 np0005531887 systemd[1]: Started Virtual Machine qemu-7-instance-00000010.
Nov 22 02:44:15 np0005531887 nova_compute[186849]: 2025-11-22 07:44:15.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:15 np0005531887 nova_compute[186849]: 2025-11-22 07:44:15.920 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797455.9197404, 144e6cca-5b79-4b25-9456-a59f6895075b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:15 np0005531887 nova_compute[186849]: 2025-11-22 07:44:15.921 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Started (Lifecycle Event)#033[00m
Nov 22 02:44:15 np0005531887 nova_compute[186849]: 2025-11-22 07:44:15.942 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.767 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797456.7670064, 144e6cca-5b79-4b25-9456-a59f6895075b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.768 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.785 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.789 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:44:16 np0005531887 nova_compute[186849]: 2025-11-22 07:44:16.807 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 22 02:44:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:18Z|00053|binding|INFO|Claiming lport 66ab05b0-442e-4420-82b9-0fc90a3df63b for this chassis.
Nov 22 02:44:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:18Z|00054|binding|INFO|66ab05b0-442e-4420-82b9-0fc90a3df63b: Claiming fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 02:44:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:18Z|00055|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b up in Southbound
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.485 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '21', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.486 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 bound to our chassis#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.488 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.501 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[245bee7e-b799-4499-8c8c-9b3cfe090ebf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.502 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd5fa4f6-01 in ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.505 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd5fa4f6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.505 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[15a68937-1841-491b-b861-416b5708c1be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.506 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa03e39-3de2-48fb-b9ba-31aaf5e8a982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.520 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[c92a15d2-81ca-42aa-959a-04fdb04091e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.534 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a92f778e-9d45-40f9-910c-ef121da4e96a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.569 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[67561558-6e74-40db-bc35-3504f34ea8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.576 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[013cf79a-ef97-4c4c-863d-f202549d3628]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 NetworkManager[55210]: <info>  [1763797458.5783] manager: (tapcd5fa4f6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.605 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[929ab698-1d8c-4b6a-b7f5-52d6abed68c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 systemd-udevd[215779]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.610 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1d17823f-2d3e-4a63-b633-13b762b22ed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 NetworkManager[55210]: <info>  [1763797458.6409] device (tapcd5fa4f6-00): carrier: link connected
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.646 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b690df1b-944d-42c5-8820-0ba555f435cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.668 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a162b6-2606-48ec-9f31-7d7203f7b86c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420124, 'reachable_time': 17415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215798, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.686 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[645c0869-91b5-4582-b4f4-e197e9062024]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:db2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420124, 'tstamp': 420124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215799, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.700 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7033558d-33eb-4fe4-8cd1-5a50dc12cdaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420124, 'reachable_time': 17415, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215800, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.732 186853 INFO nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Post operation of migration started#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.734 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[148213cc-1579-472d-a809-c0ec0611864e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.752 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "3cf2b323-ba35-4807-8337-288f6c983860" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.752 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.770 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.798 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[16e1c87f-1ae4-418b-89d8-ec1f54bd625d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.800 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.800 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.801 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd5fa4f6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:18 np0005531887 NetworkManager[55210]: <info>  [1763797458.8032] manager: (tapcd5fa4f6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 22 02:44:18 np0005531887 kernel: tapcd5fa4f6-00: entered promiscuous mode
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.804 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.805 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd5fa4f6-00, col_values=(('external_ids', {'iface-id': 'f400467f-3f35-4435-bb4a-0b3da05366fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.807 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:18Z|00056|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.808 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.809 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.810 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce9ebb6-44e0-4751-9d12-d04cdf11b4f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.810 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:44:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:18.811 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'env', 'PROCESS_TAG=haproxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.819 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.892 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.892 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.899 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:44:18 np0005531887 nova_compute[186849]: 2025-11-22 07:44:18.899 186853 INFO nova.compute.claims [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.055 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.056 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.056 186853 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.086 186853 DEBUG nova.compute.provider_tree [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.102 186853 DEBUG nova.scheduler.client.report [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.130 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.131 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:44:19 np0005531887 podman[215833]: 2025-11-22 07:44:19.188077303 +0000 UTC m=+0.049420419 container create c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.208 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.210 186853 DEBUG nova.network.neutron [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:44:19 np0005531887 systemd[1]: Started libpod-conmon-c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e.scope.
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.230 186853 INFO nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:44:19 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.250 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:44:19 np0005531887 podman[215833]: 2025-11-22 07:44:19.159982746 +0000 UTC m=+0.021325882 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:44:19 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb7127f50cfab2f9470facd81dc308cd56e345965c3fdec028c54ff67d5a3388/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:44:19 np0005531887 podman[215833]: 2025-11-22 07:44:19.273104571 +0000 UTC m=+0.134447717 container init c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:44:19 np0005531887 podman[215833]: 2025-11-22 07:44:19.27961312 +0000 UTC m=+0.140956236 container start c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:44:19 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [NOTICE]   (215853) : New worker (215855) forked
Nov 22 02:44:19 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [NOTICE]   (215853) : Loading success.
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.402 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.403 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.404 186853 INFO nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Creating image(s)#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.404 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.404 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.405 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.417 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.491 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.492 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.492 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.505 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.557 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.558 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.595 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.596 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.597 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.650 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.651 186853 DEBUG nova.virt.disk.api [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.652 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.707 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.708 186853 DEBUG nova.virt.disk.api [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.708 186853 DEBUG nova.objects.instance [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.721 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.722 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Ensure instance console log exists: /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.722 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.722 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:19 np0005531887 nova_compute[186849]: 2025-11-22 07:44:19.723 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:19 np0005531887 podman[215879]: 2025-11-22 07:44:19.859221964 +0000 UTC m=+0.074560044 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:44:19 np0005531887 podman[215880]: 2025-11-22 07:44:19.889578655 +0000 UTC m=+0.101408808 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.097 186853 DEBUG nova.network.neutron [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.097 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.099 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.107 186853 WARNING nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.114 186853 DEBUG nova.virt.libvirt.host [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.115 186853 DEBUG nova.virt.libvirt.host [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.121 186853 DEBUG nova.virt.libvirt.host [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.122 186853 DEBUG nova.virt.libvirt.host [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.124 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.125 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.125 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.126 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.126 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.126 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.127 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.127 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.127 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.127 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.128 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.128 186853 DEBUG nova.virt.hardware [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.133 186853 DEBUG nova.objects.instance [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.148 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <uuid>3cf2b323-ba35-4807-8337-288f6c983860</uuid>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <name>instance-00000013</name>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:name>tempest-MigrationsAdminTest-server-1564380060</nova:name>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:44:20</nova:creationTime>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="serial">3cf2b323-ba35-4807-8337-288f6c983860</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="uuid">3cf2b323-ba35-4807-8337-288f6c983860</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/console.log" append="off"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:44:20 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:44:20 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:44:20 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:44:20 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.202 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.203 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.203 186853 INFO nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Using config drive#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.370 186853 INFO nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Creating config drive at /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.374 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjkyie014 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:20 np0005531887 nova_compute[186849]: 2025-11-22 07:44:20.498 186853 DEBUG oslo_concurrency.processutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjkyie014" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:20 np0005531887 systemd-machined[153180]: New machine qemu-8-instance-00000013.
Nov 22 02:44:20 np0005531887 systemd[1]: Started Virtual Machine qemu-8-instance-00000013.
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.165 186853 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.222 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797461.2223527, 3cf2b323-ba35-4807-8337-288f6c983860 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.223 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.225 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.226 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.231 186853 INFO nova.virt.libvirt.driver [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance spawned successfully.#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.232 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.247 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.250 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.263 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.268 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.268 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.269 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.270 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.270 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.271 186853 DEBUG nova.virt.libvirt.driver [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.277 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.278 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.278 186853 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.283 186853 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 22 02:44:21 np0005531887 virtqemud[186424]: Domain id=7 name='instance-00000010' uuid=144e6cca-5b79-4b25-9456-a59f6895075b is tainted: custom-monitor
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.467 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.473 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.473 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797461.2252, 3cf2b323-ba35-4807-8337-288f6c983860 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.474 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Started (Lifecycle Event)#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.502 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.507 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.534 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.571 186853 INFO nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Took 2.17 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.572 186853 DEBUG nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.650 186853 INFO nova.compute.manager [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Took 2.80 seconds to build instance.#033[00m
Nov 22 02:44:21 np0005531887 nova_compute[186849]: 2025-11-22 07:44:21.671 186853 DEBUG oslo_concurrency.lockutils [None req-6bef3eb2-c824-4411-8908-d079321867b9 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:22 np0005531887 nova_compute[186849]: 2025-11-22 07:44:22.293 186853 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 22 02:44:23 np0005531887 nova_compute[186849]: 2025-11-22 07:44:23.300 186853 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 22 02:44:23 np0005531887 nova_compute[186849]: 2025-11-22 07:44:23.307 186853 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:23 np0005531887 nova_compute[186849]: 2025-11-22 07:44:23.333 186853 DEBUG nova.objects.instance [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:44:24 np0005531887 nova_compute[186849]: 2025-11-22 07:44:24.782 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:24 np0005531887 nova_compute[186849]: 2025-11-22 07:44:24.782 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:24 np0005531887 nova_compute[186849]: 2025-11-22 07:44:24.782 186853 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.029 186853 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.157 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.342 186853 DEBUG nova.network.neutron [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.359 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.479 186853 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.480 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Creating file /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/d8c3ab0cd91c4389981cbefda382b123.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.480 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/d8c3ab0cd91c4389981cbefda382b123.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:25 np0005531887 podman[215954]: 2025-11-22 07:44:25.864065736 +0000 UTC m=+0.069658293 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.942 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/d8c3ab0cd91c4389981cbefda382b123.tmp" returned: 1 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.943 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/d8c3ab0cd91c4389981cbefda382b123.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.943 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Creating directory /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 02:44:25 np0005531887 nova_compute[186849]: 2025-11-22 07:44:25.944 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:26 np0005531887 nova_compute[186849]: 2025-11-22 07:44:26.167 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:26 np0005531887 nova_compute[186849]: 2025-11-22 07:44:26.173 186853 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:44:26 np0005531887 nova_compute[186849]: 2025-11-22 07:44:26.470 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:28 np0005531887 podman[215979]: 2025-11-22 07:44:28.842223904 +0000 UTC m=+0.058528771 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:44:30 np0005531887 nova_compute[186849]: 2025-11-22 07:44:30.160 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.169 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.169 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.170 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.172 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.173 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.183 186853 INFO nova.compute.manager [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Terminating instance#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.194 186853 DEBUG nova.compute.manager [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:44:31 np0005531887 kernel: tap606a4645-89 (unregistering): left promiscuous mode
Nov 22 02:44:31 np0005531887 NetworkManager[55210]: <info>  [1763797471.2257] device (tap606a4645-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:44:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:31Z|00057|binding|INFO|Releasing lport 606a4645-8996-452d-9864-00ce49d9140c from this chassis (sb_readonly=0)
Nov 22 02:44:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:31Z|00058|binding|INFO|Setting lport 606a4645-8996-452d-9864-00ce49d9140c down in Southbound
Nov 22 02:44:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:31Z|00059|binding|INFO|Removing iface tap606a4645-89 ovn-installed in OVS
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.236 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.240 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.247 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a6:f6 10.100.0.6'], port_security=['fa:16:3e:ce:a6:f6 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9141d9a1-7a40-4a72-a3f7-2d67ae112383', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=606a4645-8996-452d-9864-00ce49d9140c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.249 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 606a4645-8996-452d-9864-00ce49d9140c in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.251 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ba1c27-6255-4c71-8e98-23a1c59b5723, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.253 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d95a7209-98df-41c8-869c-39e4d359f41b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.254 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace which is not needed anymore#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.260 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000012.scope: Deactivated successfully.
Nov 22 02:44:31 np0005531887 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000012.scope: Consumed 16.825s CPU time.
Nov 22 02:44:31 np0005531887 systemd-machined[153180]: Machine qemu-6-instance-00000012 terminated.
Nov 22 02:44:31 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [NOTICE]   (215400) : haproxy version is 2.8.14-c23fe91
Nov 22 02:44:31 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [NOTICE]   (215400) : path to executable is /usr/sbin/haproxy
Nov 22 02:44:31 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [WARNING]  (215400) : Exiting Master process...
Nov 22 02:44:31 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [ALERT]    (215400) : Current worker (215406) exited with code 143 (Terminated)
Nov 22 02:44:31 np0005531887 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[215379]: [WARNING]  (215400) : All workers exited. Exiting... (0)
Nov 22 02:44:31 np0005531887 systemd[1]: libpod-c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799.scope: Deactivated successfully.
Nov 22 02:44:31 np0005531887 podman[216021]: 2025-11-22 07:44:31.410878396 +0000 UTC m=+0.055497188 container died c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:44:31 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799-userdata-shm.mount: Deactivated successfully.
Nov 22 02:44:31 np0005531887 systemd[1]: var-lib-containers-storage-overlay-7e6c8a3c630d48c9c21bb5e3cbc5930f23c1d567c11262afa9edbfcd59b211d7-merged.mount: Deactivated successfully.
Nov 22 02:44:31 np0005531887 podman[216021]: 2025-11-22 07:44:31.471440815 +0000 UTC m=+0.116059577 container cleanup c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.472 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.475 186853 INFO nova.virt.libvirt.driver [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Instance destroyed successfully.#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.476 186853 DEBUG nova.objects.instance [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 9141d9a1-7a40-4a72-a3f7-2d67ae112383 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.489 186853 DEBUG nova.virt.libvirt.vif [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-611382116',display_name='tempest-ServersAdminTestJSON-server-611382116',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-611382116',id=18,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-e3r0caf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:43:50Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=9141d9a1-7a40-4a72-a3f7-2d67ae112383,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.489 186853 DEBUG nova.network.os_vif_util [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "606a4645-8996-452d-9864-00ce49d9140c", "address": "fa:16:3e:ce:a6:f6", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606a4645-89", "ovs_interfaceid": "606a4645-8996-452d-9864-00ce49d9140c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.490 186853 DEBUG nova.network.os_vif_util [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.491 186853 DEBUG os_vif [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 systemd[1]: libpod-conmon-c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799.scope: Deactivated successfully.
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.494 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap606a4645-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.501 186853 INFO os_vif [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a6:f6,bridge_name='br-int',has_traffic_filtering=True,id=606a4645-8996-452d-9864-00ce49d9140c,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606a4645-89')#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.502 186853 INFO nova.virt.libvirt.driver [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Deleting instance files /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383_del#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.503 186853 INFO nova.virt.libvirt.driver [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Deletion of /var/lib/nova/instances/9141d9a1-7a40-4a72-a3f7-2d67ae112383_del complete#033[00m
Nov 22 02:44:31 np0005531887 podman[216067]: 2025-11-22 07:44:31.556439543 +0000 UTC m=+0.053980781 container remove c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.562 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9df77a0d-6ab3-4e7e-968b-ca45ed9c4b0c]: (4, ('Sat Nov 22 07:44:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799)\nc34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799\nSat Nov 22 07:44:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (c34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799)\nc34aae5a32655d08bb9a4cb3d84f64f791bf035b319d96480ec9b00736b02799\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.564 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c95757d7-9b6b-4bfd-a896-843b1ba31f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.565 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:31 np0005531887 kernel: tapd7ba1c27-60: left promiscuous mode
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.567 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.572 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[803e8b9a-f746-4c0a-b971-7aa557ba1e90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.593 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc7f83d-1575-4cc0-b973-c18f63fabc8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.593 186853 INFO nova.compute.manager [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.594 186853 DEBUG oslo.service.loopingcall [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.594 186853 DEBUG nova.compute.manager [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.594 186853 DEBUG nova.network.neutron [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.595 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d2960d73-f96d-4dbc-af74-bd79d1765db7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.610 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5e697d0c-32c5-421a-93d6-78637cad7f02]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417142, 'reachable_time': 26933, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216082, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd7ba1c27\x2d6255\x2d4c71\x2d8e98\x2d23a1c59b5723.mount: Deactivated successfully.
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.615 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:44:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:31.616 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[06a62ba5-b3b9-4c57-b079-ecd1c23b334a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.648 186853 DEBUG nova.compute.manager [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-unplugged-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.648 186853 DEBUG oslo_concurrency.lockutils [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.649 186853 DEBUG oslo_concurrency.lockutils [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.649 186853 DEBUG oslo_concurrency.lockutils [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.649 186853 DEBUG nova.compute.manager [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] No waiting events found dispatching network-vif-unplugged-606a4645-8996-452d-9864-00ce49d9140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:44:31 np0005531887 nova_compute[186849]: 2025-11-22 07:44:31.649 186853 DEBUG nova.compute.manager [req-66fb9561-d87f-4ecb-8c75-8c466935d460 req-51fc377a-b376-4d77-a6a9-0ab85188e39a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-unplugged-606a4645-8996-452d-9864-00ce49d9140c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.297 186853 DEBUG nova.network.neutron [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.329 186853 INFO nova.compute.manager [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Took 0.73 seconds to deallocate network for instance.#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.368 186853 DEBUG nova.compute.manager [req-0761aa60-86ab-4744-9b34-340d02b5ea94 req-6ea1dc40-e8ea-4ed3-8d18-b781e1393e9f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-deleted-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.406 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.407 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.526 186853 DEBUG nova.compute.provider_tree [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.541 186853 DEBUG nova.scheduler.client.report [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.567 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.603 186853 INFO nova.scheduler.client.report [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Deleted allocations for instance 9141d9a1-7a40-4a72-a3f7-2d67ae112383#033[00m
Nov 22 02:44:32 np0005531887 nova_compute[186849]: 2025-11-22 07:44:32.662 186853 DEBUG oslo_concurrency.lockutils [None req-29f93869-491f-4711-bfdd-d54f8d4de4c3 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:32 np0005531887 podman[216083]: 2025-11-22 07:44:32.891549619 +0000 UTC m=+0.089323953 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.750 186853 DEBUG nova.compute.manager [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.751 186853 DEBUG oslo_concurrency.lockutils [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.751 186853 DEBUG oslo_concurrency.lockutils [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.751 186853 DEBUG oslo_concurrency.lockutils [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9141d9a1-7a40-4a72-a3f7-2d67ae112383-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.751 186853 DEBUG nova.compute.manager [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] No waiting events found dispatching network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.752 186853 WARNING nova.compute.manager [req-259ccd64-b027-4003-b78b-34e2a71c96e5 req-1ed850eb-80c6-49c8-a032-a40cdd43188e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Received unexpected event network-vif-plugged-606a4645-8996-452d-9864-00ce49d9140c for instance with vm_state deleted and task_state None.
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:44:33 np0005531887 nova_compute[186849]: 2025-11-22 07:44:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:44:34 np0005531887 nova_compute[186849]: 2025-11-22 07:44:34.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.160 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:44:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:35.406 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:44:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:35.407 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.796 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 02:44:35 np0005531887 nova_compute[186849]: 2025-11-22 07:44:35.796 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.095 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.220 186853 DEBUG nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.659 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '74651b744925468db6c6e47d1397cc04', 'user_id': '4ca2e31d955040598948fa3da5d84888', 'hostId': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.662 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3cf2b323-ba35-4807-8337-288f6c983860', 'name': 'tempest-MigrationsAdminTest-server-1564380060', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '070aaece3c3c4232877d26c34023c56d', 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'hostId': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.689 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.bytes volume: 28672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.691 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.716 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.bytes volume: 72994816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.716 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72fc5bb4-dc69-4a92-8de7-d1de1372b126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28672, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.663375', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '180b32a6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': '099ddeb4acf2a8976e3ea18465988df52dee2fe84379c2011841a6ba0ad78162'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.663375', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '180b40ca-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': '8fe9aff9b774f32a7062cdd7c7cb6fafaf5f6df7c9cce92dec336b2ccd7427d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72994816, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.663375', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '180f1fd8-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': 'e83f2518d2bc70cf0f57a1f6b899121973b174277aa52610b6b0befb4c6a49fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.663375', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '180f313a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '411b2003f4ec8ff4fc89151258054fe0e83e33f3b4af5536b6edcb866abc879e'}]}, 'timestamp': '2025-11-22 07:44:36.717182', '_unique_id': '36f3a03f367d43f0b4cf78f3c3c8524c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.719 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.719 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.720 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>]
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.720 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.723 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 144e6cca-5b79-4b25-9456-a59f6895075b / tap66ab05b0-44 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.723 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bfdc69f-102c-417c-8366-102546af71b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.720335', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '1810333c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': 'be8255c981ce9baef6162721c35f5ee51b0b2ad68b40e1571551eb0cc68d4b83'}]}, 'timestamp': '2025-11-22 07:44:36.724988', '_unique_id': '7db81ae5067c438eae791ea88b50259f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.726 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.726 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.727 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.727 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.bytes volume: 30984704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.727 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83736e34-2b28-43ce-9dfb-7b8905077fbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.726901', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1810bb18-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': '81094e5d5de50ec2a8d09f9786f277d6add253eaa2ee2a1d13e5c5830775a93c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.726901', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1810c770-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': '95d7d9a9f0371266869a8437e840253d9d1a7e52efb27fa24cb876196eb80415'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30984704, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.726901', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1810d224-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': 'b4d46c47c182d8a2ac08c3ffa244f7cf9bc8789ad3f2f269552bd18fe6ce618f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.726901', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1810dc4c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': 'e706fac290011bbc3a9f6f562b373fab705406da3d7efa2bf5ab4ba45f62f651'}]}, 'timestamp': '2025-11-22 07:44:36.728049', '_unique_id': 'd237af3aabcf41f982283177f74f7b87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.729 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dca80c4-5cb2-4607-a0d9-05f11f614ede', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.729450', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '18111d1a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '42f63d5fcfc0a8b803af51030bf24768aed22c70c8980ccf2655bca5a3aa05e4'}]}, 'timestamp': '2025-11-22 07:44:36.729690', '_unique_id': 'bfb2a1420ec0401b9138cba99a9adfb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.730 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>]
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.731 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2458846-7c9a-4cf9-8e72-8fc238144958', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.731151', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '1811611c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '7b6eaee364261895bb04a5374bc67f01220ed1484aedee0f9ec4397d6d22f9c0'}]}, 'timestamp': '2025-11-22 07:44:36.731464', '_unique_id': '37909fb8acb44ded904caff1809ecd43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.732 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2585a272-611f-41bd-8316-b958a85d321b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.732656', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '18119a56-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': 'a29d4b017a8fefaf7af1d121a318d4c8a4a9eb93e9261cfafe990c7a76c3cb23'}]}, 'timestamp': '2025-11-22 07:44:36.732896', '_unique_id': 'cb190079e44e4fbe822173748f72d5f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.734 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.734 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '044fea6c-5347-4652-bdc0-0ea60b4cd811', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.734183', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '1811d7aa-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '573e64541c25c2ac7c0be957719e5370d09fc8ddef32526abb73de50cfab3c8a'}]}, 'timestamp': '2025-11-22 07:44:36.734501', '_unique_id': 'cdd2fabb590e42b5b9e60adc1e1eaab7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.735 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eec77e5b-bfdd-4523-8555-2eb7c1b5c0cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.735850', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '181216f2-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '8ae3b479d30ac9d2d87fcf3eb4992d542a32c2e50daa949bbc20ad52f0661f3d'}]}, 'timestamp': '2025-11-22 07:44:36.736086', '_unique_id': 'c04149a23efb45d69b63e7f5a9bc9534'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.737 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09177150-3025-47f7-9228-300813d959f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.737336', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '181250ea-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': 'e7eee7ff4eceddf9f5f9c57cb55f1e0391cb976ff7a779677243b9cd28432cc8'}]}, 'timestamp': '2025-11-22 07:44:36.737570', '_unique_id': '5667fc3b397442ec8356106442306403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.738 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.748 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.748 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.758 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.759 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1484d9c-49c5-4276-9375-20a1eefa71b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.739007', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18140c46-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': '510ca96160d0317ffdeda4958c17ecd8bb91448957fc5c7939445d12dad2564a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.739007', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1814161e-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': 'bfd263a42133e6ebc913e568da717811a7d9b77ed6382878e45797ba84eb1846'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.739007', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1815a86c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': '807a94f757e2e702a1392448e7a9bbdbcbd3f92d00dfe157cae0dfc0f8cd8f28'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.739007', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1815b514-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': '43874ce2b176107f7eade879b8c850dbc7f26477f1b848d8e7de02ea16a15f5b'}]}, 'timestamp': '2025-11-22 07:44:36.759832', '_unique_id': '3e619857792043e98e515e1ced98505a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.774 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.778 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/cpu volume: 120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.797 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/cpu volume: 12380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd41089bf-8303-4bd9-a6d0-b8d5de822578', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 120000000, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'timestamp': '2025-11-22T07:44:36.761989', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1818a71a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.448302055, 'message_signature': '8488bcb55bc511302f06936604a38cbb48024cef822a22017f2598f68c3ef6da'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12380000000, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'timestamp': '2025-11-22T07:44:36.761989', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '181b7b16-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.466861682, 'message_signature': '910a833427202796e93c06a390d05df5231765a48c25c96127abaab5be7a9045'}]}, 'timestamp': '2025-11-22 07:44:36.797687', '_unique_id': '2b7913d1a4af4472914721d44d03d9cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.798 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.799 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.latency volume: 8504528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.799 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.800 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.latency volume: 9312117102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.800 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.801 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.801 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abf00749-98af-4f6a-a771-3fa52d5f590d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8504528, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.799764', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181bd7be-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'cf0e38b7c00d20fd5fa93c9017b4a36afff6f50a04ce469a4e52b81a3c4513c7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.799764', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181bdf84-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'a11baec1f2cb9e47558a33a3c55400e7c61741dd26eedcec3cc8da26790637cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9312117102, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.799764', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181be7d6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '46afd4e130edfce5968a189232155c480f138f7280712d24e1d392c9cea9eea8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.799764', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181bef74-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '63d369905b4adf7e778da837851a440ab368b0e6ec54317c2d89244a1ba292f6'}]}, 'timestamp': '2025-11-22 07:44:36.800592', '_unique_id': '26e295bd04484bca832d257711d88e24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.801 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.802 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4242f7ad-7df6-4649-af5e-d4af1b2b7997', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.801887', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '181c2b24-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '6d901b1aba390d07fbebf80e2e3ae070220496da16a0e5594cbc2add379f26cc'}]}, 'timestamp': '2025-11-22 07:44:36.802137', '_unique_id': 'ee5462f4e9f14e92a99778bf8ab6758a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.802 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.803 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.803 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.804 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>]
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.804 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.804 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/memory.usage volume: 42.484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.804 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4a49ec-3228-4308-a1d1-9a6dd388fecb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.484375, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'timestamp': '2025-11-22T07:44:36.804325', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '181c8bd2-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.448302055, 'message_signature': '73166692f0e61b2a1c55755da62a3b751acaadc6884b7d0389ce326a986ef5a1'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'timestamp': '2025-11-22T07:44:36.804325', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '181c976c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.466861682, 'message_signature': 'df11da7ae4d2c14eee2c977d3c531bee253fa49b4255047867436444f2e9522b'}]}, 'timestamp': '2025-11-22 07:44:36.805077', '_unique_id': '7aab22f7c2ef448c98bceb6b07da2bf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.805 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.806 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.806 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.806 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.requests volume: 1129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.806 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '600a3f06-3ace-4858-84cd-0bac3560d105', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.806386', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181cda24-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'a6e410e767dc10d09c0e0e4e815b84bcff559c022ca3081c0b24dff42a91db0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.806386', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181ce1d6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'a5ba345fac121698cb8dfc7e3a64f9977f46999720e80a4841d842227505babb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1129, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.806386', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181ce974-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '98a6de9f074d9ed7c601016c77117b6255ddee55a1c5e1e84010ea5647d584ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.806386', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181cf0d6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '74f629c8ca8d6648d6b1dc19fe85cb8158cf3f0ae699a09c8cb3f0e8d7b50202'}]}, 'timestamp': '2025-11-22 07:44:36.807178', '_unique_id': '5e19fe21ea974092b945f80b154dd6bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.807 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.808 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.808 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.808 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.requests volume: 330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.808 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86e55774-d20e-49fc-a17f-ad95e0c6927f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.808363', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181d272c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': '8e66cf8911e8b0186317134e087c937e3ad09c8f8d4e5e29eff9dfb150242efd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 
'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.808363', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181d2ea2-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'e394de6aca72bfd89dabefdb8a62babea934c74a4f87f7fad46d71c5f9887642'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 330, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.808363', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181d3618-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '9b33ea30a2079f634f9e3dfcdce0122a9e0d8a06806bb96a99c9432e78746e3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.808363', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181d3d70-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': 'fc296226dbef42a675b47913f5c0905a1c48b533d109c3199a74e83fcdf2e2aa'}]}, 'timestamp': '2025-11-22 07:44:36.809139', '_unique_id': 'c469a86482614ba7b34fefc9c51fa3cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.809 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.810 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2328fbf8-38b0-4ea8-8f13-c744fd85833a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.810296', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '181d7506-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '54f0aab309e047ec53f4bde162d0b4374bb09241a79293042e13d56b740b2b9b'}]}, 'timestamp': '2025-11-22 07:44:36.810580', '_unique_id': '0a30b24cab5644fcb5321670a953ed99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.811 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.812 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.812 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.812 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0b8e59-1961-4fe6-8172-45ba995eb08b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.811900', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181db160-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': 'f0ef0528ebb71f62892cf7624af9ba088607f2a81c3f7c3d04d12d8cd86acd47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.811900', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181db8ea-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': '48ecd326a98d6e192909cd5909eaa7dc0b879ea983f3a8f362794f5d454bd0d5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.811900', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181dc290-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': 'fecc5826dd8a1a651c18ad821316817d0a992512f9d356987d064a2874365c76'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.811900', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181dcb64-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': '8a1feb81ef0efb9fca4ebd5081d5353bae5abaa23ca999cf2f0908e4ed7579ae'}]}, 'timestamp': '2025-11-22 07:44:36.812778', '_unique_id': '6faaa4c4d052413283f44c70cd1b6602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.814 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.814 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.814 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.latency volume: 767771907 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.814 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.read.latency volume: 45721845 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e228b39-a2a0-40b7-94ea-968e02d73a43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.814014', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181e03fe-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'a7c2d1a4f04565fb7db15135d606cd74224ad785aa86fb8ace591da578a3ef1a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.814014', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181e0d04-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.333198859, 'message_signature': 'ab8319a1d9013420483a89c57f5ed0b352cbfa8031488121fa4f4b47442a1309'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 767771907, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.814014', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181e148e-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '4d05c2d4cb665c2ef6eab1bff7005d0dc2f25bec14c37cb73f9addc77c3cf025'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45721845, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.814014', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181e1bf0-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.361118547, 'message_signature': '4e8a470613ca69eac8da666c1128dd75a6f03e3d1972f1005b4984a36e0364dd'}]}, 'timestamp': '2025-11-22 07:44:36.814835', '_unique_id': '1619942293ef469d8ade9e5e8ba0290e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.816 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.816 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.816 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.816 12 DEBUG ceilometer.compute.pollsters [-] 3cf2b323-ba35-4807-8337-288f6c983860/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '622cc8d7-4760-40fb-93ff-b1e7b43f2ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-vda', 'timestamp': '2025-11-22T07:44:36.816049', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181e5386-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': 'c02ed5ffef33655a225f700c60ab3602c4e936cc8f24c93c8c346cb1d0adfcb3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': '144e6cca-5b79-4b25-9456-a59f6895075b-sda', 'timestamp': '2025-11-22T07:44:36.816049', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'instance-00000010', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181e5d7c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.408812231, 'message_signature': 'd9a6a4c137e2d52fc3d8b0dba9d36108b6071c951fe647a39ef52585810592fc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-vda', 'timestamp': '2025-11-22T07:44:36.816049', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181e6524-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': '3556103a353231ce4c92a44aeef17edd2180dc63a63aedde09955d18c5be1285'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '3cf2b323-ba35-4807-8337-288f6c983860-sda', 'timestamp': '2025-11-22T07:44:36.816049', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1564380060', 'name': 'instance-00000013', 'instance_id': '3cf2b323-ba35-4807-8337-288f6c983860', 'instance_type': 'm1.nano', 'host': '130aecd49a665fe1d506bbcb189c6fffbb7b32b8a3374ac7c046aaea', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181e6e02-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.418936211, 'message_signature': 'ba2660dd1c8ec67faaa4222a800447752356fd55a381f9be73f4113136f418cc'}]}, 'timestamp': '2025-11-22 07:44:36.816975', '_unique_id': 'f5b5b56681de47e1805c46352f554e8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.817 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 DEBUG ceilometer.compute.pollsters [-] 144e6cca-5b79-4b25-9456-a59f6895075b/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.818 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.819 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.819 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32fc76a4-a54e-4308-8cbc-99dc0810ee31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_name': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_name': None, 'resource_id': 'instance-00000010-144e6cca-5b79-4b25-9456-a59f6895075b-tap66ab05b0-44', 'timestamp': '2025-11-22T07:44:36.818278', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1027576693', 'name': 'tap66ab05b0-44', 'instance_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'instance_type': 'm1.nano', 'host': 'd06eef4a3fe42049def3cd897e84669b62c1583eb0ccbdf323f1197c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4f:30:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66ab05b0-44'}, 'message_id': '181eab06-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4219.390152102, 'message_signature': '695ec0f1d3ca617caa7dfac216b852700eed3e3b9253d532d42649509ececbd8'}]}, 'timestamp': '2025-11-22 07:44:36.818516', '_unique_id': 'c20c12c6929d48aa9beda6bddb7393d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.819 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.818 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.819 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:44:36.819 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1027576693>, <NovaLikeServer: tempest-MigrationsAdminTest-server-1564380060>]
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.891 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.955 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:36 np0005531887 nova_compute[186849]: 2025-11-22 07:44:36.957 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.012 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.017 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.070 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.071 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.143 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.294 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.296 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5460MB free_disk=73.40065383911133GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.296 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.296 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:37.316 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:37.317 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:37.317 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.341 186853 INFO nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating resource usage from migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.364 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 144e6cca-5b79-4b25-9456-a59f6895075b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.364 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.365 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.365 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.423 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.434 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.477 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:44:37 np0005531887 nova_compute[186849]: 2025-11-22 07:44:37.478 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:37 np0005531887 podman[216132]: 2025-11-22 07:44:37.847613327 +0000 UTC m=+0.057198119 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:44:38 np0005531887 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 22 02:44:38 np0005531887 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000013.scope: Consumed 14.011s CPU time.
Nov 22 02:44:38 np0005531887 systemd-machined[153180]: Machine qemu-8-instance-00000013 terminated.
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.235 186853 INFO nova.virt.libvirt.driver [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.241 186853 INFO nova.virt.libvirt.driver [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance destroyed successfully.#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.244 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.300 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.301 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.354 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.356 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk to 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.356 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.446 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.447 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.447 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:44:39 np0005531887 nova_compute[186849]: 2025-11-22 07:44:39.447 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.162 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.186 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk" returned: 0 in 0.829s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.186 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.187 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.408 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -C -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.config" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.410 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.410 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.637 186853 DEBUG oslo_concurrency.processutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -C -r /var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3cf2b323-ba35-4807-8337-288f6c983860/disk.info" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.925 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "3cf2b323-ba35-4807-8337-288f6c983860-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.925 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:40 np0005531887 nova_compute[186849]: 2025-11-22 07:44:40.925 186853 DEBUG oslo_concurrency.lockutils [None req-5806ffce-ba35-46d1-9ba1-6fa21e211774 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:44:41.409 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:44:41 np0005531887 nova_compute[186849]: 2025-11-22 07:44:41.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:42 np0005531887 podman[216184]: 2025-11-22 07:44:42.870311114 +0000 UTC m=+0.072952708 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.164 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.220 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "3cf2b323-ba35-4807-8337-288f6c983860" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.221 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.221 186853 DEBUG nova.compute.manager [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Going to confirm migration 4 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.249 186853 DEBUG nova.objects.instance [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'info_cache' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.475 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.475 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.475 186853 DEBUG nova.network.neutron [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:44:45 np0005531887 nova_compute[186849]: 2025-11-22 07:44:45.724 186853 DEBUG nova.network.neutron [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:44:46 np0005531887 nova_compute[186849]: 2025-11-22 07:44:46.473 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797471.4715776, 9141d9a1-7a40-4a72-a3f7-2d67ae112383 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:46 np0005531887 nova_compute[186849]: 2025-11-22 07:44:46.473 186853 INFO nova.compute.manager [-] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:44:46 np0005531887 nova_compute[186849]: 2025-11-22 07:44:46.501 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:46 np0005531887 nova_compute[186849]: 2025-11-22 07:44:46.814 186853 DEBUG nova.compute.manager [None req-9dd19e95-b951-4fb6-a92e-4edc45493f6f - - - - - -] [instance: 9141d9a1-7a40-4a72-a3f7-2d67ae112383] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.065 186853 DEBUG nova.network.neutron [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.085 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-3cf2b323-ba35-4807-8337-288f6c983860" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.085 186853 DEBUG nova.objects.instance [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 3cf2b323-ba35-4807-8337-288f6c983860 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.108 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.109 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.227 186853 DEBUG nova.compute.provider_tree [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.247 186853 DEBUG nova.scheduler.client.report [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:44:47 np0005531887 nova_compute[186849]: 2025-11-22 07:44:47.384 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:48 np0005531887 nova_compute[186849]: 2025-11-22 07:44:48.260 186853 INFO nova.scheduler.client.report [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocation for migration ed0b10e7-46ed-431c-bd8e-aa93dcdc9988#033[00m
Nov 22 02:44:48 np0005531887 nova_compute[186849]: 2025-11-22 07:44:48.390 186853 DEBUG oslo_concurrency.lockutils [None req-e17e67eb-1788-407f-81a8-1119256d2128 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "3cf2b323-ba35-4807-8337-288f6c983860" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:44:50 np0005531887 nova_compute[186849]: 2025-11-22 07:44:50.167 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:50 np0005531887 podman[216205]: 2025-11-22 07:44:50.839211664 +0000 UTC m=+0.057422505 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 02:44:50 np0005531887 podman[216206]: 2025-11-22 07:44:50.868317261 +0000 UTC m=+0.081983810 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 02:44:51 np0005531887 nova_compute[186849]: 2025-11-22 07:44:51.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:51 np0005531887 ovn_controller[95130]: 2025-11-22T07:44:51Z|00060|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 02:44:51 np0005531887 nova_compute[186849]: 2025-11-22 07:44:51.767 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:53 np0005531887 nova_compute[186849]: 2025-11-22 07:44:53.661 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797478.6606617, 3cf2b323-ba35-4807-8337-288f6c983860 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:44:53 np0005531887 nova_compute[186849]: 2025-11-22 07:44:53.662 186853 INFO nova.compute.manager [-] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:44:53 np0005531887 nova_compute[186849]: 2025-11-22 07:44:53.685 186853 DEBUG nova.compute.manager [None req-2fe4e6f7-6b33-4ae7-96cc-02908ee5200c - - - - - -] [instance: 3cf2b323-ba35-4807-8337-288f6c983860] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:44:55 np0005531887 nova_compute[186849]: 2025-11-22 07:44:55.169 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531887 nova_compute[186849]: 2025-11-22 07:44:56.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:44:56 np0005531887 podman[216253]: 2025-11-22 07:44:56.861510201 +0000 UTC m=+0.077749196 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:44:59 np0005531887 nova_compute[186849]: 2025-11-22 07:44:59.425 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "88c868e5-67c5-4f22-b584-d8772316044d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:44:59 np0005531887 nova_compute[186849]: 2025-11-22 07:44:59.426 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:44:59 np0005531887 nova_compute[186849]: 2025-11-22 07:44:59.515 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:44:59 np0005531887 podman[216278]: 2025-11-22 07:44:59.836334399 +0000 UTC m=+0.057220020 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.009 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.010 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.017 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.017 186853 INFO nova.compute.claims [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.234 186853 DEBUG nova.scheduler.client.report [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.364 186853 DEBUG nova.scheduler.client.report [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.365 186853 DEBUG nova.compute.provider_tree [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.409 186853 DEBUG nova.scheduler.client.report [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.457 186853 DEBUG nova.scheduler.client.report [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.547 186853 DEBUG nova.compute.provider_tree [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.570 186853 DEBUG nova.scheduler.client.report [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.629 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.630 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.868 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.868 186853 DEBUG nova.network.neutron [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.901 186853 INFO nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:45:00 np0005531887 nova_compute[186849]: 2025-11-22 07:45:00.939 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.156 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.157 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.157 186853 INFO nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Creating image(s)#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.158 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.158 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.159 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.172 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.231 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.232 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.233 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.245 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.308 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.309 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.366 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.368 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.368 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.417 186853 DEBUG nova.network.neutron [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.417 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.425 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.425 186853 DEBUG nova.virt.disk.api [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Checking if we can resize image /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.426 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.484 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.485 186853 DEBUG nova.virt.disk.api [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Cannot resize image /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.486 186853 DEBUG nova.objects.instance [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.510 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.510 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Ensure instance console log exists: /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.511 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.511 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.511 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.513 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.519 186853 WARNING nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.524 186853 DEBUG nova.virt.libvirt.host [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.525 186853 DEBUG nova.virt.libvirt.host [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.529 186853 DEBUG nova.virt.libvirt.host [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.530 186853 DEBUG nova.virt.libvirt.host [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.532 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.532 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:44:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='03a9bbee-8c6d-4345-a323-8fa81a00e495',id=28,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-2139299838',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.533 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.534 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.534 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.534 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.535 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.535 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.535 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.536 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.536 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.536 186853 DEBUG nova.virt.hardware [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.542 186853 DEBUG nova.objects.instance [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'pci_devices' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.562 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <uuid>88c868e5-67c5-4f22-b584-d8772316044d</uuid>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <name>instance-00000016</name>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:name>tempest-MigrationsAdminTest-server-1406881377</nova:name>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:45:01</nova:creationTime>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:flavor name="tempest-test_resize_flavor_-2139299838">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="serial">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="uuid">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/console.log" append="off"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:45:01 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:45:01 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:45:01 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:45:01 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.610 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.611 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.611 186853 INFO nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Using config drive#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.784 186853 INFO nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Creating config drive at /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.790 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjq6plz6n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:01 np0005531887 nova_compute[186849]: 2025-11-22 07:45:01.918 186853 DEBUG oslo_concurrency.processutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjq6plz6n" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:01 np0005531887 systemd-machined[153180]: New machine qemu-9-instance-00000016.
Nov 22 02:45:02 np0005531887 systemd[1]: Started Virtual Machine qemu-9-instance-00000016.
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.356 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797502.3557308, 88c868e5-67c5-4f22-b584-d8772316044d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.357 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.360 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.361 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.365 186853 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance spawned successfully.#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.366 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.391 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.395 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.395 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.396 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.396 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.396 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.397 186853 DEBUG nova.virt.libvirt.driver [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.401 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.444 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.445 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797502.357297, 88c868e5-67c5-4f22-b584-d8772316044d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.445 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.478 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.481 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.499 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.510 186853 INFO nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Took 1.35 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.511 186853 DEBUG nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.586 186853 INFO nova.compute.manager [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Took 2.99 seconds to build instance.#033[00m
Nov 22 02:45:02 np0005531887 nova_compute[186849]: 2025-11-22 07:45:02.602 186853 DEBUG oslo_concurrency.lockutils [None req-833aa902-3371-4b56-96de-fd676a536046 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:03 np0005531887 podman[216342]: 2025-11-22 07:45:03.857243362 +0000 UTC m=+0.075714656 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:45:05 np0005531887 nova_compute[186849]: 2025-11-22 07:45:05.173 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:05 np0005531887 nova_compute[186849]: 2025-11-22 07:45:05.808 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:05 np0005531887 nova_compute[186849]: 2025-11-22 07:45:05.809 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:05 np0005531887 nova_compute[186849]: 2025-11-22 07:45:05.809 186853 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.003 186853 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.394 186853 DEBUG nova.network.neutron [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.413 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.549 186853 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.550 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Creating file /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/a65d771a39fb447682725e1109c27eee.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 02:45:06 np0005531887 nova_compute[186849]: 2025-11-22 07:45:06.550 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/a65d771a39fb447682725e1109c27eee.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.065 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/a65d771a39fb447682725e1109c27eee.tmp" returned: 1 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.066 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/a65d771a39fb447682725e1109c27eee.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.066 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Creating directory /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.066 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.300 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:07 np0005531887 nova_compute[186849]: 2025-11-22 07:45:07.306 186853 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:45:08 np0005531887 podman[216365]: 2025-11-22 07:45:08.852955944 +0000 UTC m=+0.054286458 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:45:10 np0005531887 nova_compute[186849]: 2025-11-22 07:45:10.177 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:11 np0005531887 nova_compute[186849]: 2025-11-22 07:45:11.512 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:13 np0005531887 podman[216389]: 2025-11-22 07:45:13.848353537 +0000 UTC m=+0.055549628 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=)
Nov 22 02:45:15 np0005531887 nova_compute[186849]: 2025-11-22 07:45:15.178 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:16 np0005531887 nova_compute[186849]: 2025-11-22 07:45:16.515 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:17 np0005531887 nova_compute[186849]: 2025-11-22 07:45:17.362 186853 DEBUG nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:45:19 np0005531887 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000016.scope: Deactivated successfully.
Nov 22 02:45:19 np0005531887 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000016.scope: Consumed 13.582s CPU time.
Nov 22 02:45:19 np0005531887 systemd-machined[153180]: Machine qemu-9-instance-00000016 terminated.
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.139 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.140 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.140 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.140 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.141 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.151 186853 INFO nova.compute.manager [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Terminating instance#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.158 186853 DEBUG nova.compute.manager [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:45:20 np0005531887 kernel: tap66ab05b0-44 (unregistering): left promiscuous mode
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.183 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 NetworkManager[55210]: <info>  [1763797520.1855] device (tap66ab05b0-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:45:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:20Z|00061|binding|INFO|Releasing lport 66ab05b0-442e-4420-82b9-0fc90a3df63b from this chassis (sb_readonly=0)
Nov 22 02:45:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:20Z|00062|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b down in Southbound
Nov 22 02:45:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:20Z|00063|binding|INFO|Removing iface tap66ab05b0-44 ovn-installed in OVS
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.202 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.205 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '23', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.206 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.207 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.209 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[09afec7f-bf44-47e3-b416-a74d9b07991f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.209 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace which is not needed anymore#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.219 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 22 02:45:20 np0005531887 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Consumed 4.758s CPU time.
Nov 22 02:45:20 np0005531887 systemd-machined[153180]: Machine qemu-7-instance-00000010 terminated.
Nov 22 02:45:20 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [NOTICE]   (215853) : haproxy version is 2.8.14-c23fe91
Nov 22 02:45:20 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [NOTICE]   (215853) : path to executable is /usr/sbin/haproxy
Nov 22 02:45:20 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [WARNING]  (215853) : Exiting Master process...
Nov 22 02:45:20 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [ALERT]    (215853) : Current worker (215855) exited with code 143 (Terminated)
Nov 22 02:45:20 np0005531887 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215849]: [WARNING]  (215853) : All workers exited. Exiting... (0)
Nov 22 02:45:20 np0005531887 systemd[1]: libpod-c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e.scope: Deactivated successfully.
Nov 22 02:45:20 np0005531887 podman[216453]: 2025-11-22 07:45:20.372335939 +0000 UTC m=+0.049661675 container died c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:45:20 np0005531887 NetworkManager[55210]: <info>  [1763797520.3771] manager: (tap66ab05b0-44): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.382 186853 INFO nova.virt.libvirt.driver [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.403 186853 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance destroyed successfully.#033[00m
Nov 22 02:45:20 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e-userdata-shm.mount: Deactivated successfully.
Nov 22 02:45:20 np0005531887 systemd[1]: var-lib-containers-storage-overlay-eb7127f50cfab2f9470facd81dc308cd56e345965c3fdec028c54ff67d5a3388-merged.mount: Deactivated successfully.
Nov 22 02:45:20 np0005531887 podman[216453]: 2025-11-22 07:45:20.420841084 +0000 UTC m=+0.098166800 container cleanup c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.426 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:20 np0005531887 systemd[1]: libpod-conmon-c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e.scope: Deactivated successfully.
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.450 186853 INFO nova.virt.libvirt.driver [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance destroyed successfully.#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.452 186853 DEBUG nova.objects.instance [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'resources' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.472 186853 DEBUG nova.virt.libvirt.vif [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:23Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.473 186853 DEBUG nova.network.os_vif_util [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.474 186853 DEBUG nova.network.os_vif_util [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.475 186853 DEBUG os_vif [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.476 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.477 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66ab05b0-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.480 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:20 np0005531887 podman[216494]: 2025-11-22 07:45:20.481315843 +0000 UTC m=+0.039834572 container remove c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.487 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb19b84-9ba0-4cc7-b26f-3f04970f0a75]: (4, ('Sat Nov 22 07:45:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e)\nc21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e\nSat Nov 22 07:45:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (c21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e)\nc21bb2b3eae28e30c7a99511df45e3b38b42ff02dfc83101fd13c51ac4a5ca6e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.488 186853 INFO os_vif [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.488 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[29bcb62f-d0f7-42df-8640-c3c240e8cada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.489 186853 INFO nova.virt.libvirt.driver [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deleting instance files /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.490 186853 INFO nova.virt.libvirt.driver [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deletion of /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del complete#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.490 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:20 np0005531887 kernel: tapcd5fa4f6-00: left promiscuous mode
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.492 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.500 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d62ff17f-01c7-4be5-bb66-cbda9c704413]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.501 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.502 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.515 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[81ecbc86-23f5-4167-8d38-64fc7b7e4b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.517 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[130bde7e-c9e2-4a43-afa0-8e94b4087830]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.533 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[14a0a80d-12bb-4e3b-97e4-994ea1f26251]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420116, 'reachable_time': 18602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216514, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 systemd[1]: run-netns-ovnmeta\x2dcd5fa4f6\x2d0f1b\x2d41f2\x2d9643\x2d3c1a36620dc9.mount: Deactivated successfully.
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.539 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:45:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:20.539 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e5757a-fda1-411a-8bfc-8c7ac7cc6c84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.559 186853 DEBUG nova.compute.manager [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.560 186853 DEBUG oslo_concurrency.lockutils [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.560 186853 DEBUG oslo_concurrency.lockutils [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.560 186853 DEBUG oslo_concurrency.lockutils [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.560 186853 DEBUG nova.compute.manager [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.561 186853 DEBUG nova.compute.manager [req-7a9c1159-cfa6-48d1-a1f8-a0aa4da60dfe req-404f4e35-fe9d-46bb-bf08-1fc84f28f483 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.563 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.564 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk to 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.565 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.623 186853 INFO nova.compute.manager [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.624 186853 DEBUG oslo.service.loopingcall [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.624 186853 DEBUG nova.compute.manager [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:45:20 np0005531887 nova_compute[186849]: 2025-11-22 07:45:20.625 186853 DEBUG nova.network.neutron [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.474 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk" returned: 0 in 0.909s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.475 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.475 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.config 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.589 186853 DEBUG nova.network.neutron [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.612 186853 INFO nova.compute.manager [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 0.99 seconds to deallocate network for instance.#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.699 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.700 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.709 186853 DEBUG nova.compute.manager [req-f1b912e5-bf14-4554-88c7-f3fc045e13da req-ba78abbc-c975-4199-9968-3b552eb02e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-deleted-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.716 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -C -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.config 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config" returned: 0 in 0.241s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.716 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Copying file /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.717 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.info 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.793 186853 DEBUG nova.compute.provider_tree [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.810 186853 DEBUG nova.scheduler.client.report [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.834 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:21 np0005531887 podman[216523]: 2025-11-22 07:45:21.841548925 +0000 UTC m=+0.062960442 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.857 186853 INFO nova.scheduler.client.report [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Deleted allocations for instance 144e6cca-5b79-4b25-9456-a59f6895075b#033[00m
Nov 22 02:45:21 np0005531887 podman[216524]: 2025-11-22 07:45:21.894055619 +0000 UTC m=+0.112395320 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.937 186853 DEBUG oslo_concurrency.processutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "scp -C -r /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_resize/disk.info 192.168.122.102:/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:21 np0005531887 nova_compute[186849]: 2025-11-22 07:45:21.969 186853 DEBUG oslo_concurrency.lockutils [None req-0291a836-c56c-48d7-8a67-afa26041b2d7 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.091 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "88c868e5-67c5-4f22-b584-d8772316044d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.092 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.093 186853 DEBUG oslo_concurrency.lockutils [None req-1122cdfb-2cfb-4b05-8e49-41651ac50ef6 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.665 186853 DEBUG nova.compute.manager [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.667 186853 DEBUG oslo_concurrency.lockutils [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.667 186853 DEBUG oslo_concurrency.lockutils [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.667 186853 DEBUG oslo_concurrency.lockutils [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.667 186853 DEBUG nova.compute.manager [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:45:22 np0005531887 nova_compute[186849]: 2025-11-22 07:45:22.667 186853 WARNING nova.compute.manager [req-45f76b71-8be0-4f19-8e1f-e20edd6850dd req-1f1c7320-f1e2-45e8-8ff6-9d8cc5dbd204 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:45:25 np0005531887 nova_compute[186849]: 2025-11-22 07:45:25.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:25 np0005531887 nova_compute[186849]: 2025-11-22 07:45:25.480 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:26 np0005531887 nova_compute[186849]: 2025-11-22 07:45:26.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:27 np0005531887 nova_compute[186849]: 2025-11-22 07:45:27.664 186853 INFO nova.compute.manager [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Swapping old allocation on dict_keys(['9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78']) held by migration 6daa597c-30c1-47c8-a172-26297bb49453 for instance#033[00m
Nov 22 02:45:27 np0005531887 nova_compute[186849]: 2025-11-22 07:45:27.698 186853 DEBUG nova.scheduler.client.report [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Overwriting current allocation {'allocations': {'1afd6948-7df7-46e7-8718-35e2b3007a5d': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}, 'generation': 16}}, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'consumer_generation': 1} on consumer 88c868e5-67c5-4f22-b584-d8772316044d move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Nov 22 02:45:27 np0005531887 podman[216568]: 2025-11-22 07:45:27.84020228 +0000 UTC m=+0.056429971 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.301 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.302 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.302 186853 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.512 186853 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.843 186853 DEBUG nova.network.neutron [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.859 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.860 186853 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.867 186853 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.870 186853 WARNING nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.875 186853 DEBUG nova.virt.libvirt.host [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.875 186853 DEBUG nova.virt.libvirt.host [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.879 186853 DEBUG nova.virt.libvirt.host [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.880 186853 DEBUG nova.virt.libvirt.host [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.881 186853 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.881 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:44:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='03a9bbee-8c6d-4345-a323-8fa81a00e495',id=28,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-2139299838',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.882 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.882 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.882 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.882 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.883 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.883 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.883 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.883 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.883 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.884 186853 DEBUG nova.virt.hardware [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.884 186853 DEBUG nova.objects.instance [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.899 186853 DEBUG oslo_concurrency.processutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.976 186853 DEBUG oslo_concurrency.processutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.977 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.977 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.978 186853 DEBUG oslo_concurrency.lockutils [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:28 np0005531887 nova_compute[186849]: 2025-11-22 07:45:28.980 186853 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <uuid>88c868e5-67c5-4f22-b584-d8772316044d</uuid>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <name>instance-00000016</name>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:name>tempest-MigrationsAdminTest-server-1406881377</nova:name>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:45:28</nova:creationTime>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:flavor name="tempest-test_resize_flavor_-2139299838">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="serial">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="uuid">88c868e5-67c5-4f22-b584-d8772316044d</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk.config"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/console.log" append="off"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <input type="keyboard" bus="usb"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:45:28 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:45:28 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:45:28 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:45:28 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:29 np0005531887 systemd-machined[153180]: New machine qemu-10-instance-00000016.
Nov 22 02:45:29 np0005531887 systemd[1]: Started Virtual Machine qemu-10-instance-00000016.
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.383 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 88c868e5-67c5-4f22-b584-d8772316044d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.384 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797529.3817384, 88c868e5-67c5-4f22-b584-d8772316044d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.384 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.387 186853 DEBUG nova.compute.manager [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.391 186853 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance running successfully.#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.391 186853 DEBUG nova.virt.libvirt.driver [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.421 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.424 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.455 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.456 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797529.3838754, 88c868e5-67c5-4f22-b584-d8772316044d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.456 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.485 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.489 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.495 186853 INFO nova.compute.manager [None req-3f0060b6-e018-47d5-9330-cebe5ad75f31 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance to original state: 'active'#033[00m
Nov 22 02:45:29 np0005531887 nova_compute[186849]: 2025-11-22 07:45:29.519 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 22 02:45:30 np0005531887 nova_compute[186849]: 2025-11-22 07:45:30.219 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:30 np0005531887 nova_compute[186849]: 2025-11-22 07:45:30.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:30 np0005531887 podman[216621]: 2025-11-22 07:45:30.847528798 +0000 UTC m=+0.065168765 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:45:34 np0005531887 nova_compute[186849]: 2025-11-22 07:45:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:34 np0005531887 podman[216641]: 2025-11-22 07:45:34.858592949 +0000 UTC m=+0.070512207 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.221 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.449 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797520.4337218, 144e6cca-5b79-4b25-9456-a59f6895075b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.449 186853 INFO nova.compute.manager [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.473 186853 DEBUG nova.compute.manager [None req-6647dab9-d5f8-47d7-9d51-a67f3a9591e6 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.794 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.795 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:35 np0005531887 nova_compute[186849]: 2025-11-22 07:45:35.819 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:35.818 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:35.820 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:45:36 np0005531887 nova_compute[186849]: 2025-11-22 07:45:36.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:36 np0005531887 nova_compute[186849]: 2025-11-22 07:45:36.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:36.822 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:37.318 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:37.318 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:37.318 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:37 np0005531887 nova_compute[186849]: 2025-11-22 07:45:37.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.787 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.871 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.930 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.931 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:38 np0005531887 nova_compute[186849]: 2025-11-22 07:45:38.995 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.164 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.166 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5604MB free_disk=73.4295883178711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.167 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.167 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.348 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 88c868e5-67c5-4f22-b584-d8772316044d actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.349 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.349 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.418 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.446 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.491 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:45:39 np0005531887 nova_compute[186849]: 2025-11-22 07:45:39.492 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:39 np0005531887 podman[216670]: 2025-11-22 07:45:39.859581202 +0000 UTC m=+0.081435127 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.224 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.300 186853 DEBUG nova.compute.manager [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.475 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.476 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.493 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.493 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.494 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.499 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'pci_requests' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.521 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.521 186853 INFO nova.compute.claims [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.522 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'resources' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.533 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'numa_topology' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.550 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.589 186853 INFO nova.compute.resource_tracker [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating resource usage from migration 3de1f32a-6ea3-48fe-b75c-a192ccefb94a#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.590 186853 DEBUG nova.compute.resource_tracker [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Starting to track incoming migration 3de1f32a-6ea3-48fe-b75c-a192ccefb94a with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.674 186853 DEBUG nova.compute.provider_tree [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.687 186853 DEBUG nova.scheduler.client.report [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.711 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:40 np0005531887 nova_compute[186849]: 2025-11-22 07:45:40.711 186853 INFO nova.compute.manager [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Migrating#033[00m
Nov 22 02:45:41 np0005531887 systemd-logind[821]: New session 29 of user nova.
Nov 22 02:45:41 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:45:41 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:45:41 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:45:41 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:45:41 np0005531887 systemd[216704]: Queued start job for default target Main User Target.
Nov 22 02:45:41 np0005531887 systemd[216704]: Created slice User Application Slice.
Nov 22 02:45:41 np0005531887 systemd[216704]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:41 np0005531887 systemd[216704]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:41 np0005531887 systemd[216704]: Reached target Paths.
Nov 22 02:45:41 np0005531887 systemd[216704]: Reached target Timers.
Nov 22 02:45:41 np0005531887 nova_compute[186849]: 2025-11-22 07:45:41.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:45:41 np0005531887 systemd[216704]: Starting D-Bus User Message Bus Socket...
Nov 22 02:45:41 np0005531887 systemd[216704]: Starting Create User's Volatile Files and Directories...
Nov 22 02:45:41 np0005531887 systemd[216704]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:45:41 np0005531887 systemd[216704]: Reached target Sockets.
Nov 22 02:45:41 np0005531887 systemd[216704]: Finished Create User's Volatile Files and Directories.
Nov 22 02:45:41 np0005531887 systemd[216704]: Reached target Basic System.
Nov 22 02:45:41 np0005531887 systemd[216704]: Reached target Main User Target.
Nov 22 02:45:41 np0005531887 systemd[216704]: Startup finished in 159ms.
Nov 22 02:45:41 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:45:41 np0005531887 systemd[1]: Started Session 29 of User nova.
Nov 22 02:45:41 np0005531887 systemd[1]: session-29.scope: Deactivated successfully.
Nov 22 02:45:41 np0005531887 systemd-logind[821]: Session 29 logged out. Waiting for processes to exit.
Nov 22 02:45:41 np0005531887 systemd-logind[821]: Removed session 29.
Nov 22 02:45:42 np0005531887 systemd-logind[821]: New session 31 of user nova.
Nov 22 02:45:42 np0005531887 systemd[1]: Started Session 31 of User nova.
Nov 22 02:45:42 np0005531887 systemd[1]: session-31.scope: Deactivated successfully.
Nov 22 02:45:42 np0005531887 systemd-logind[821]: Session 31 logged out. Waiting for processes to exit.
Nov 22 02:45:42 np0005531887 systemd-logind[821]: Removed session 31.
Nov 22 02:45:44 np0005531887 podman[216727]: 2025-11-22 07:45:44.870874578 +0000 UTC m=+0.077084790 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 22 02:45:45 np0005531887 nova_compute[186849]: 2025-11-22 07:45:45.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:45 np0005531887 nova_compute[186849]: 2025-11-22 07:45:45.537 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:50 np0005531887 nova_compute[186849]: 2025-11-22 07:45:50.227 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:50 np0005531887 nova_compute[186849]: 2025-11-22 07:45:50.571 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:52 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:45:52 np0005531887 systemd[216704]: Activating special unit Exit the Session...
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped target Main User Target.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped target Basic System.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped target Paths.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped target Sockets.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped target Timers.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:52 np0005531887 systemd[216704]: Closed D-Bus User Message Bus Socket.
Nov 22 02:45:52 np0005531887 systemd[216704]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:45:52 np0005531887 systemd[216704]: Removed slice User Application Slice.
Nov 22 02:45:52 np0005531887 systemd[216704]: Reached target Shutdown.
Nov 22 02:45:52 np0005531887 systemd[216704]: Finished Exit the Session.
Nov 22 02:45:52 np0005531887 systemd[216704]: Reached target Exit the Session.
Nov 22 02:45:52 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:45:52 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:45:52 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:45:52 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:45:52 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:45:52 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:45:52 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:45:52 np0005531887 podman[216750]: 2025-11-22 07:45:52.34815155 +0000 UTC m=+0.067015041 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 22 02:45:52 np0005531887 podman[216751]: 2025-11-22 07:45:52.370021509 +0000 UTC m=+0.085050086 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.202 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.202 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.265 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.507 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.507 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.516 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.516 186853 INFO nova.compute.claims [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:45:53 np0005531887 nova_compute[186849]: 2025-11-22 07:45:53.730 186853 DEBUG nova.compute.provider_tree [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.110 186853 DEBUG nova.scheduler.client.report [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.148 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.149 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.299 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.299 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.387 186853 INFO nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.440 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.596 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.598 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.598 186853 INFO nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Creating image(s)#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.599 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.599 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.600 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.614 186853 DEBUG nova.policy [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.618 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.681 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.682 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.683 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.695 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.760 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.762 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.803 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.805 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.806 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.872 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.874 186853 DEBUG nova.virt.disk.api [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Checking if we can resize image /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.875 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.942 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.944 186853 DEBUG nova.virt.disk.api [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Cannot resize image /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.944 186853 DEBUG nova.objects.instance [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'migration_context' on Instance uuid 1cb44f55-1231-44bc-8a2d-e598899b2f89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.967 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.967 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Ensure instance console log exists: /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.968 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.968 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:54 np0005531887 nova_compute[186849]: 2025-11-22 07:45:54.969 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:55 np0005531887 nova_compute[186849]: 2025-11-22 07:45:55.229 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:55 np0005531887 nova_compute[186849]: 2025-11-22 07:45:55.365 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Successfully created port: d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:45:55 np0005531887 systemd-logind[821]: New session 32 of user nova.
Nov 22 02:45:55 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:45:55 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:45:55 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:45:55 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:45:55 np0005531887 nova_compute[186849]: 2025-11-22 07:45:55.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:55 np0005531887 systemd[216814]: Queued start job for default target Main User Target.
Nov 22 02:45:55 np0005531887 systemd[216814]: Created slice User Application Slice.
Nov 22 02:45:55 np0005531887 systemd[216814]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:45:55 np0005531887 systemd[216814]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:45:55 np0005531887 systemd[216814]: Reached target Paths.
Nov 22 02:45:55 np0005531887 systemd[216814]: Reached target Timers.
Nov 22 02:45:55 np0005531887 systemd[216814]: Starting D-Bus User Message Bus Socket...
Nov 22 02:45:55 np0005531887 systemd[216814]: Starting Create User's Volatile Files and Directories...
Nov 22 02:45:55 np0005531887 systemd[216814]: Finished Create User's Volatile Files and Directories.
Nov 22 02:45:55 np0005531887 systemd[216814]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:45:55 np0005531887 systemd[216814]: Reached target Sockets.
Nov 22 02:45:55 np0005531887 systemd[216814]: Reached target Basic System.
Nov 22 02:45:55 np0005531887 systemd[216814]: Reached target Main User Target.
Nov 22 02:45:55 np0005531887 systemd[216814]: Startup finished in 131ms.
Nov 22 02:45:55 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:45:55 np0005531887 systemd[1]: Started Session 32 of User nova.
Nov 22 02:45:56 np0005531887 systemd[1]: session-32.scope: Deactivated successfully.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Session 32 logged out. Waiting for processes to exit.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Removed session 32.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: New session 34 of user nova.
Nov 22 02:45:56 np0005531887 systemd[1]: Started Session 34 of User nova.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Session 34 logged out. Waiting for processes to exit.
Nov 22 02:45:56 np0005531887 systemd[1]: session-34.scope: Deactivated successfully.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Removed session 34.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: New session 35 of user nova.
Nov 22 02:45:56 np0005531887 systemd[1]: Started Session 35 of User nova.
Nov 22 02:45:56 np0005531887 systemd[1]: session-35.scope: Deactivated successfully.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Session 35 logged out. Waiting for processes to exit.
Nov 22 02:45:56 np0005531887 systemd-logind[821]: Removed session 35.
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.556 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Successfully updated port: d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.588 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.589 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.589 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.666 186853 DEBUG nova.compute.manager [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.667 186853 DEBUG nova.compute.manager [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing instance network info cache due to event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:45:56 np0005531887 nova_compute[186849]: 2025-11-22 07:45:56.667 186853 DEBUG oslo_concurrency.lockutils [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.002 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.002 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquired lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.003 186853 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.015 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.146 186853 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.391 186853 DEBUG nova.network.neutron [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.411 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Releasing lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.564 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.567 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.568 186853 INFO nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Creating image(s)#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.569 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.579 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.650 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.654 186853 DEBUG nova.virt.disk.api [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Checking if we can resize image /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.655 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.715 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.716 186853 DEBUG nova.virt.disk.api [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Cannot resize image /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.733 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.734 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Ensure instance console log exists: /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.734 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.735 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.735 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.737 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.742 186853 WARNING nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.749 186853 DEBUG nova.virt.libvirt.host [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.751 186853 DEBUG nova.virt.libvirt.host [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.755 186853 DEBUG nova.virt.libvirt.host [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.756 186853 DEBUG nova.virt.libvirt.host [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.757 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.758 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.758 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.758 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.759 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.759 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.759 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.759 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.759 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.760 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.760 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.760 186853 DEBUG nova.virt.hardware [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.760 186853 DEBUG nova.objects.instance [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.780 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.802 186853 DEBUG nova.network.neutron [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.839 186853 DEBUG oslo_concurrency.processutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.841 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.841 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.842 186853 DEBUG oslo_concurrency.lockutils [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.845 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <uuid>0edda70f-511a-49a0-8c13-561c699336c1</uuid>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <name>instance-0000001a</name>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:name>tempest-MigrationsAdminTest-server-1141640296</nova:name>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:45:57</nova:creationTime>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="serial">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="uuid">0edda70f-511a-49a0-8c13-561c699336c1</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/disk.config"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/console.log" append="off"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:45:57 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:45:57 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.860 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.860 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Instance network_info: |[{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.861 186853 DEBUG oslo_concurrency.lockutils [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.861 186853 DEBUG nova.network.neutron [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.865 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Start _get_guest_xml network_info=[{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.872 186853 WARNING nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.878 186853 DEBUG nova.virt.libvirt.host [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.878 186853 DEBUG nova.virt.libvirt.host [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.882 186853 DEBUG nova.virt.libvirt.host [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.882 186853 DEBUG nova.virt.libvirt.host [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.884 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.884 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.884 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.885 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.885 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.885 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.885 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.885 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.886 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.886 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.886 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.886 186853 DEBUG nova.virt.hardware [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.890 186853 DEBUG nova.virt.libvirt.vif [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1328693686',display_name='tempest-FloatingIPsAssociationTestJSON-server-1328693686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1328693686',id=27,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-fpdsdowm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098',owner_us
er_name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:45:54Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=1cb44f55-1231-44bc-8a2d-e598899b2f89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.890 186853 DEBUG nova.network.os_vif_util [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.891 186853 DEBUG nova.network.os_vif_util [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.892 186853 DEBUG nova.objects.instance [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'pci_devices' on Instance uuid 1cb44f55-1231-44bc-8a2d-e598899b2f89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.911 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <uuid>1cb44f55-1231-44bc-8a2d-e598899b2f89</uuid>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <name>instance-0000001b</name>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1328693686</nova:name>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:45:57</nova:creationTime>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:user uuid="65ded9a5f9a7463d8c52561197054664">tempest-FloatingIPsAssociationTestJSON-1465053098-project-member</nova:user>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:project uuid="ef27a8ab9a794f7782ac89b9c28c893a">tempest-FloatingIPsAssociationTestJSON-1465053098</nova:project>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        <nova:port uuid="d3f68e24-4f74-40aa-8f91-ff43c969dcd0">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="serial">1cb44f55-1231-44bc-8a2d-e598899b2f89</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="uuid">1cb44f55-1231-44bc-8a2d-e598899b2f89</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.config"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:fd:ee:47"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <target dev="tapd3f68e24-4f"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/console.log" append="off"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:45:57 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:45:57 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:45:57 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:45:57 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.912 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Preparing to wait for external event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.913 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.914 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.914 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.915 186853 DEBUG nova.virt.libvirt.vif [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1328693686',display_name='tempest-FloatingIPsAssociationTestJSON-server-1328693686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1328693686',id=27,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-fpdsdowm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098
',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:45:54Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=1cb44f55-1231-44bc-8a2d-e598899b2f89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.915 186853 DEBUG nova.network.os_vif_util [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.916 186853 DEBUG nova.network.os_vif_util [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.917 186853 DEBUG os_vif [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.919 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.920 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.920 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.925 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.926 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3f68e24-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.927 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3f68e24-4f, col_values=(('external_ids', {'iface-id': 'd3f68e24-4f74-40aa-8f91-ff43c969dcd0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:ee:47', 'vm-uuid': '1cb44f55-1231-44bc-8a2d-e598899b2f89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.928 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.929 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:57 np0005531887 NetworkManager[55210]: <info>  [1763797557.9295] manager: (tapd3f68e24-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.930 186853 INFO nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Using config drive#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.933 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:57 np0005531887 nova_compute[186849]: 2025-11-22 07:45:57.939 186853 INFO os_vif [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f')#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.019 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.020 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.021 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No VIF found with MAC fa:16:3e:fd:ee:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.021 186853 INFO nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Using config drive#033[00m
Nov 22 02:45:58 np0005531887 systemd-machined[153180]: New machine qemu-11-instance-0000001a.
Nov 22 02:45:58 np0005531887 systemd[1]: Started Virtual Machine qemu-11-instance-0000001a.
Nov 22 02:45:58 np0005531887 podman[216859]: 2025-11-22 07:45:58.098556238 +0000 UTC m=+0.081750894 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.496 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797558.4961097, 0edda70f-511a-49a0-8c13-561c699336c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.499 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.502 186853 DEBUG nova.compute.manager [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.506 186853 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance running successfully.#033[00m
Nov 22 02:45:58 np0005531887 virtqemud[186424]: argument unsupported: QEMU guest agent is not configured
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.510 186853 DEBUG nova.virt.libvirt.guest [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.510 186853 DEBUG nova.virt.libvirt.driver [None req-c56a04af-3610-4964-9a07-7ea3ee144b11 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.535 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.539 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.577 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.578 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797558.4976172, 0edda70f-511a-49a0-8c13-561c699336c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.578 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.598 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.602 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.620 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.869 186853 INFO nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Creating config drive at /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.config#033[00m
Nov 22 02:45:58 np0005531887 nova_compute[186849]: 2025-11-22 07:45:58.878 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpligxmv4i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.012 186853 DEBUG oslo_concurrency.processutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpligxmv4i" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:45:59 np0005531887 kernel: tapd3f68e24-4f: entered promiscuous mode
Nov 22 02:45:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:59Z|00064|binding|INFO|Claiming lport d3f68e24-4f74-40aa-8f91-ff43c969dcd0 for this chassis.
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.096 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:59Z|00065|binding|INFO|d3f68e24-4f74-40aa-8f91-ff43c969dcd0: Claiming fa:16:3e:fd:ee:47 10.100.0.6
Nov 22 02:45:59 np0005531887 systemd-udevd[216900]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.1077] manager: (tapd3f68e24-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.1256] device (tapd3f68e24-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.1269] device (tapd3f68e24-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:45:59 np0005531887 systemd-machined[153180]: New machine qemu-12-instance-0000001b.
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.169 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:59Z|00066|binding|INFO|Setting lport d3f68e24-4f74-40aa-8f91-ff43c969dcd0 ovn-installed in OVS
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.176 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 systemd[1]: Started Virtual Machine qemu-12-instance-0000001b.
Nov 22 02:45:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:59Z|00067|binding|INFO|Setting lport d3f68e24-4f74-40aa-8f91-ff43c969dcd0 up in Southbound
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.482 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:ee:47 10.100.0.6'], port_security=['fa:16:3e:fd:ee:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ba545ee-7ef7-4120-9b36-dfb927d132f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=750112b4-7c3d-47fc-a624-7726e73fdc53, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d3f68e24-4f74-40aa-8f91-ff43c969dcd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.484 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 in datapath 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 bound to our chassis#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.485 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.498 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3b1d8e-a250-4701-a45d-f76ea0ae39b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.500 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9dfbfc3c-a1 in ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.503 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9dfbfc3c-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.503 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e87794ef-3e95-40e8-8b97-6bcc69fd3cb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.504 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2c4bb5-79c2-4a53-bf1e-710c67caf7cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.522 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5c0ac7-c04a-4793-8c63-62b023e4bb69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.523 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797559.522825, 1cb44f55-1231-44bc-8a2d-e598899b2f89 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.524 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] VM Started (Lifecycle Event)#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.544 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.549 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee6f665-8b91-4307-b447-1cc1fb9eed33]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.551 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797559.524604, 1cb44f55-1231-44bc-8a2d-e598899b2f89 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.551 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.566 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.571 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.586 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.598 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e90dce-0556-4234-affc-df30f6018e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.6100] manager: (tap9dfbfc3c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.607 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[af4f3039-417a-442b-9191-2d34c37d0cfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.658 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[169b59ff-1a8a-4294-afea-29ae9c506a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.661 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b76473b6-3789-42e8-a215-7050fc77c8c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.6851] device (tap9dfbfc3c-a0): carrier: link connected
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.691 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[fb437c7a-f9ab-4b6e-a1d3-95bc54b8ba52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.711 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[770cd033-f6d5-40cf-8290-f43ca3491d0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dfbfc3c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:03:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430229, 'reachable_time': 26524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216960, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.732 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d98ecd93-42cd-44b8-a6a0-2c2ef572415b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:374'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430229, 'tstamp': 430229}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216961, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.753 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d00bec-4609-437d-8ba4-3dffba197452]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dfbfc3c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:03:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430229, 'reachable_time': 26524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216962, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.789 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ade3452-85a3-45a0-8074-bbf9d7a2fe92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.856 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f5f1ec-1bbc-4716-9d74-c14ae3e56388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.857 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dfbfc3c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.857 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.858 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dfbfc3c-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.859 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 kernel: tap9dfbfc3c-a0: entered promiscuous mode
Nov 22 02:45:59 np0005531887 NetworkManager[55210]: <info>  [1763797559.8624] manager: (tap9dfbfc3c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.866 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9dfbfc3c-a0, col_values=(('external_ids', {'iface-id': 'd4b08431-3a8d-4e48-ba0b-792923071bed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.868 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.869 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:45:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:45:59Z|00068|binding|INFO|Releasing lport d4b08431-3a8d-4e48-ba0b-792923071bed from this chassis (sb_readonly=0)
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.870 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[24a5bfd0-4952-4ba2-a3e8-2d5c6b7ef2e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.870 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:45:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:45:59.871 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'env', 'PROCESS_TAG=haproxy-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:45:59 np0005531887 nova_compute[186849]: 2025-11-22 07:45:59.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.231 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:00 np0005531887 podman[216993]: 2025-11-22 07:46:00.248213774 +0000 UTC m=+0.059438656 container create 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 02:46:00 np0005531887 systemd[1]: Started libpod-conmon-6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96.scope.
Nov 22 02:46:00 np0005531887 podman[216993]: 2025-11-22 07:46:00.212383961 +0000 UTC m=+0.023608863 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:46:00 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:46:00 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15a6ea8cf80624f138d8bed53e0b03759b84cbaf74fd404f8e41b0ac2fc11410/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:46:00 np0005531887 podman[216993]: 2025-11-22 07:46:00.341505651 +0000 UTC m=+0.152730583 container init 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 02:46:00 np0005531887 podman[216993]: 2025-11-22 07:46:00.347327795 +0000 UTC m=+0.158552697 container start 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:46:00 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [NOTICE]   (217013) : New worker (217015) forked
Nov 22 02:46:00 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [NOTICE]   (217013) : Loading success.
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.494 186853 DEBUG nova.compute.manager [req-b4609e85-06f4-4819-8fe5-4e8bad52d82c req-16d19372-edab-4062-aa13-9f6d61edb05d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.494 186853 DEBUG oslo_concurrency.lockutils [req-b4609e85-06f4-4819-8fe5-4e8bad52d82c req-16d19372-edab-4062-aa13-9f6d61edb05d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.495 186853 DEBUG oslo_concurrency.lockutils [req-b4609e85-06f4-4819-8fe5-4e8bad52d82c req-16d19372-edab-4062-aa13-9f6d61edb05d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.496 186853 DEBUG oslo_concurrency.lockutils [req-b4609e85-06f4-4819-8fe5-4e8bad52d82c req-16d19372-edab-4062-aa13-9f6d61edb05d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.496 186853 DEBUG nova.compute.manager [req-b4609e85-06f4-4819-8fe5-4e8bad52d82c req-16d19372-edab-4062-aa13-9f6d61edb05d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Processing event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.497 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.501 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797560.5013514, 1cb44f55-1231-44bc-8a2d-e598899b2f89 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.502 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.504 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.508 186853 INFO nova.virt.libvirt.driver [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Instance spawned successfully.#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.508 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.532 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.537 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.542 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.542 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.543 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.543 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.544 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.544 186853 DEBUG nova.virt.libvirt.driver [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.569 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.631 186853 INFO nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Took 6.03 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.631 186853 DEBUG nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.708 186853 INFO nova.compute.manager [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Took 7.24 seconds to build instance.#033[00m
Nov 22 02:46:00 np0005531887 nova_compute[186849]: 2025-11-22 07:46:00.797 186853 DEBUG oslo_concurrency.lockutils [None req-f64a9b92-82ee-4c9e-b667-8ee3d0ae4346 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.339 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.339 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.340 186853 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.408 186853 DEBUG nova.network.neutron [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated VIF entry in instance network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.409 186853 DEBUG nova.network.neutron [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:01 np0005531887 nova_compute[186849]: 2025-11-22 07:46:01.425 186853 DEBUG oslo_concurrency.lockutils [req-8671a8f6-73e8-46c5-aa91-3685ef029aac req-9b7c6162-c113-4e09-bb0c-8a4e9c3a8963 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:01 np0005531887 podman[217024]: 2025-11-22 07:46:01.851898182 +0000 UTC m=+0.060479961 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.311 186853 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.594 186853 DEBUG nova.compute.manager [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.595 186853 DEBUG oslo_concurrency.lockutils [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.595 186853 DEBUG oslo_concurrency.lockutils [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.596 186853 DEBUG oslo_concurrency.lockutils [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.596 186853 DEBUG nova.compute.manager [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] No waiting events found dispatching network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.596 186853 WARNING nova.compute.manager [req-c14eddb6-2e39-4601-bd41-5fb50b8e738c req-f113bea0-3084-44d5-aebf-37141afd3515 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received unexpected event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.779 186853 DEBUG nova.network.neutron [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.794 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-0edda70f-511a-49a0-8c13-561c699336c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.808 186853 DEBUG nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Creating tmpfile /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1/tmpm_4yzszx to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 22 02:46:02 np0005531887 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 22 02:46:02 np0005531887 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001a.scope: Consumed 4.751s CPU time.
Nov 22 02:46:02 np0005531887 systemd-machined[153180]: Machine qemu-11-instance-0000001a terminated.
Nov 22 02:46:02 np0005531887 nova_compute[186849]: 2025-11-22 07:46:02.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.074 186853 INFO nova.virt.libvirt.driver [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Instance destroyed successfully.#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.075 186853 DEBUG nova.objects.instance [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.088 186853 INFO nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Deleting instance files /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_del#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.095 186853 INFO nova.virt.libvirt.driver [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Deletion of /var/lib/nova/instances/0edda70f-511a-49a0-8c13-561c699336c1_del complete#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.400 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.401 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.465 186853 DEBUG nova.objects.instance [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'migration_context' on Instance uuid 0edda70f-511a-49a0-8c13-561c699336c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.569 186853 DEBUG nova.compute.provider_tree [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.581 186853 DEBUG nova.scheduler.client.report [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:03 np0005531887 nova_compute[186849]: 2025-11-22 07:46:03.671 186853 DEBUG oslo_concurrency.lockutils [None req-05c9d166-73d2-48dc-9be4-85c7c59fe3eb 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:05 np0005531887 nova_compute[186849]: 2025-11-22 07:46:05.233 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:05 np0005531887 podman[217053]: 2025-11-22 07:46:05.836449779 +0000 UTC m=+0.055696103 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:46:06 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:46:06 np0005531887 systemd[216814]: Activating special unit Exit the Session...
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped target Main User Target.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped target Basic System.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped target Paths.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped target Sockets.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped target Timers.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:46:06 np0005531887 systemd[216814]: Closed D-Bus User Message Bus Socket.
Nov 22 02:46:06 np0005531887 systemd[216814]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:46:06 np0005531887 systemd[216814]: Removed slice User Application Slice.
Nov 22 02:46:06 np0005531887 systemd[216814]: Reached target Shutdown.
Nov 22 02:46:06 np0005531887 systemd[216814]: Finished Exit the Session.
Nov 22 02:46:06 np0005531887 systemd[216814]: Reached target Exit the Session.
Nov 22 02:46:06 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:46:06 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:46:06 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:46:06 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:46:06 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:46:06 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:46:06 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:46:07 np0005531887 nova_compute[186849]: 2025-11-22 07:46:07.936 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.423 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "88c868e5-67c5-4f22-b584-d8772316044d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.424 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.424 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "88c868e5-67c5-4f22-b584-d8772316044d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.424 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.424 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.431 186853 INFO nova.compute.manager [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Terminating instance#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.436 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.436 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.437 186853 DEBUG nova.network.neutron [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.613 186853 DEBUG nova.network.neutron [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:10 np0005531887 podman[217077]: 2025-11-22 07:46:10.844006573 +0000 UTC m=+0.056651066 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.957 186853 DEBUG nova.network.neutron [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.972 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-88c868e5-67c5-4f22-b584-d8772316044d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:10 np0005531887 nova_compute[186849]: 2025-11-22 07:46:10.972 186853 DEBUG nova.compute.manager [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:46:11 np0005531887 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Deactivated successfully.
Nov 22 02:46:11 np0005531887 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Consumed 14.943s CPU time.
Nov 22 02:46:11 np0005531887 systemd-machined[153180]: Machine qemu-10-instance-00000016 terminated.
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.220 186853 INFO nova.virt.libvirt.driver [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance destroyed successfully.#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.221 186853 DEBUG nova.objects.instance [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 88c868e5-67c5-4f22-b584-d8772316044d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.252 186853 INFO nova.virt.libvirt.driver [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Deleting instance files /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_del#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.258 186853 INFO nova.virt.libvirt.driver [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Deletion of /var/lib/nova/instances/88c868e5-67c5-4f22-b584-d8772316044d_del complete#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.381 186853 INFO nova.compute.manager [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.381 186853 DEBUG oslo.service.loopingcall [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.381 186853 DEBUG nova.compute.manager [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.382 186853 DEBUG nova.network.neutron [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.562 186853 DEBUG nova.network.neutron [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.578 186853 DEBUG nova.network.neutron [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.590 186853 INFO nova.compute.manager [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Took 0.21 seconds to deallocate network for instance.#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.705 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.705 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.842 186853 DEBUG nova.compute.provider_tree [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.864 186853 DEBUG nova.scheduler.client.report [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.901 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:11 np0005531887 nova_compute[186849]: 2025-11-22 07:46:11.950 186853 INFO nova.scheduler.client.report [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocations for instance 88c868e5-67c5-4f22-b584-d8772316044d#033[00m
Nov 22 02:46:12 np0005531887 nova_compute[186849]: 2025-11-22 07:46:12.050 186853 DEBUG oslo_concurrency.lockutils [None req-90c14353-f334-4299-9c1e-c5009634b601 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "88c868e5-67c5-4f22-b584-d8772316044d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:12 np0005531887 nova_compute[186849]: 2025-11-22 07:46:12.939 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:14 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:14Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:ee:47 10.100.0.6
Nov 22 02:46:14 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:14Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:ee:47 10.100.0.6
Nov 22 02:46:15 np0005531887 nova_compute[186849]: 2025-11-22 07:46:15.236 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:15 np0005531887 podman[217129]: 2025-11-22 07:46:15.840168716 +0000 UTC m=+0.057368074 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 02:46:17 np0005531887 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 02:46:17 np0005531887 nova_compute[186849]: 2025-11-22 07:46:17.944 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:18 np0005531887 nova_compute[186849]: 2025-11-22 07:46:18.071 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797563.0689294, 0edda70f-511a-49a0-8c13-561c699336c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:18 np0005531887 nova_compute[186849]: 2025-11-22 07:46:18.072 186853 INFO nova.compute.manager [-] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:46:18 np0005531887 nova_compute[186849]: 2025-11-22 07:46:18.091 186853 DEBUG nova.compute.manager [None req-548bf3ab-4a10-47b2-ace7-4a51d5fd958c - - - - - -] [instance: 0edda70f-511a-49a0-8c13-561c699336c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:20 np0005531887 nova_compute[186849]: 2025-11-22 07:46:20.240 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0266] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/42)
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0277] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.026 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0289] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/43)
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0293] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0304] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0312] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0316] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:46:22 np0005531887 NetworkManager[55210]: <info>  [1763797582.0321] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:22Z|00069|binding|INFO|Releasing lport d4b08431-3a8d-4e48-ba0b-792923071bed from this chassis (sb_readonly=0)
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.184 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:22 np0005531887 podman[217154]: 2025-11-22 07:46:22.842021379 +0000 UTC m=+0.066478209 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.852 186853 DEBUG nova.compute.manager [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.853 186853 DEBUG nova.compute.manager [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing instance network info cache due to event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.853 186853 DEBUG oslo_concurrency.lockutils [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.853 186853 DEBUG oslo_concurrency.lockutils [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.853 186853 DEBUG nova.network.neutron [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:22 np0005531887 podman[217155]: 2025-11-22 07:46:22.896341556 +0000 UTC m=+0.115399183 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 02:46:22 np0005531887 nova_compute[186849]: 2025-11-22 07:46:22.946 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531887 nova_compute[186849]: 2025-11-22 07:46:25.243 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:25 np0005531887 nova_compute[186849]: 2025-11-22 07:46:25.278 186853 DEBUG nova.network.neutron [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated VIF entry in instance network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:25 np0005531887 nova_compute[186849]: 2025-11-22 07:46:25.278 186853 DEBUG nova.network.neutron [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:25 np0005531887 nova_compute[186849]: 2025-11-22 07:46:25.309 186853 DEBUG oslo_concurrency.lockutils [req-c3c886e9-9cc6-40e0-aa2d-b62e3fd2fc63 req-50436129-8e86-4678-9c05-418d92fec0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:26 np0005531887 nova_compute[186849]: 2025-11-22 07:46:26.220 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797571.2188137, 88c868e5-67c5-4f22-b584-d8772316044d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:46:26 np0005531887 nova_compute[186849]: 2025-11-22 07:46:26.221 186853 INFO nova.compute.manager [-] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:46:26 np0005531887 nova_compute[186849]: 2025-11-22 07:46:26.350 186853 DEBUG nova.compute.manager [None req-f5006dc8-1ddf-4b45-9a5a-1480accd879f - - - - - -] [instance: 88c868e5-67c5-4f22-b584-d8772316044d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:46:27 np0005531887 nova_compute[186849]: 2025-11-22 07:46:27.950 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:28 np0005531887 podman[217202]: 2025-11-22 07:46:28.835444153 +0000 UTC m=+0.053778625 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:46:30 np0005531887 nova_compute[186849]: 2025-11-22 07:46:30.244 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.093 186853 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.093 186853 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing instance network info cache due to event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.094 186853 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.094 186853 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.094 186853 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:32 np0005531887 podman[217227]: 2025-11-22 07:46:32.822120806 +0000 UTC m=+0.045814810 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:46:32 np0005531887 nova_compute[186849]: 2025-11-22 07:46:32.953 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:34 np0005531887 nova_compute[186849]: 2025-11-22 07:46:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:35 np0005531887 nova_compute[186849]: 2025-11-22 07:46:35.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:35 np0005531887 nova_compute[186849]: 2025-11-22 07:46:35.291 186853 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated VIF entry in instance network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:35 np0005531887 nova_compute[186849]: 2025-11-22 07:46:35.292 186853 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:35 np0005531887 nova_compute[186849]: 2025-11-22 07:46:35.314 186853 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:35 np0005531887 nova_compute[186849]: 2025-11-22 07:46:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.661 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'user_id': '65ded9a5f9a7463d8c52561197054664', 'hostId': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.latency volume: 1697198374 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18a221f2-34c5-41fe-91ae-fa07ba7a0b4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1697198374, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.662644', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f919da4-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '5bc69d3b263baeb08ef83d3dde0c32a7e0174642708b07c89e6bfcda8d35f79a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.662644', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f91b17c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': 'ad80ad269dc8184c8de61f2880e933b8fea8d46ba98fc3a757e2cf1cef0f37b0'}]}, 'timestamp': '2025-11-22 07:46:36.690683', '_unique_id': '53b3c4bac79f423c8330df171b7a4eb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.latency volume: 827533976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.latency volume: 139954890 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eec0a29-c29e-42be-98d3-53a098b613c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 827533976, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.693139', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9225a8-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '5d01c26899bce75b5e2426c2048ff3f4d74505eabfe65a196a7ec5c1e40f872a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 139954890, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.693139', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f923714-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '221c4dfa934dc4f7b3e07c7b94de781b0342982db9d13240b84c8370e28f16d4'}]}, 'timestamp': '2025-11-22 07:46:36.694100', '_unique_id': 'cebde385636744cbab88ad6425e19ae4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.700 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1cb44f55-1231-44bc-8a2d-e598899b2f89 / tapd3f68e24-4f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.700 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c305e55-6477-49e1-8292-385ee219e25e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.696408', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f934b90-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': 'f63f5117a7896b61dd4732e32430e45f0dbbcef936e718c31459642476a7e7ff'}]}, 'timestamp': '2025-11-22 07:46:36.701211', '_unique_id': 'a9272ca390854e598436dff6cf435728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.703 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.703 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>]
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.715 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.715 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '682e5db9-040c-4620-83cd-f77159e7a13a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.704034', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9586c6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': 'e70d5cb233d500ca6f58640a319ec33014f7456e1813bf3b00d7e69c8b67cb22'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.704034', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9593b4-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': '2a6bbb944fc119a66675ebee4c68018978253dc5051064fd113af4cd0edd2f9d'}]}, 'timestamp': '2025-11-22 07:46:36.716069', '_unique_id': '73ad0c2ca89444868e590826e9070dfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26d99dbe-46df-4eb6-96d9-14dfb059d018', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.718030', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f95ebe8-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '533178a1b5114400b550d3b4a33da96a77093e1b4d97d167e768f3c768dd5c8a'}]}, 'timestamp': '2025-11-22 07:46:36.718358', '_unique_id': 'd62b281aae944fd498e08def6e4b2b88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.719 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.incoming.bytes volume: 1394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10f68cf8-a59c-4697-9a54-301b080f7d16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1394, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.719679', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f962b62-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '76d37f692172f30d037eb0a0241883341d8da3b3bfcdd4e0d79632011a324604'}]}, 'timestamp': '2025-11-22 07:46:36.719945', '_unique_id': '8651f9db305e45b1bba9f21d29cb2817'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.721 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b2e769c-ebec-4541-9fda-c124299ab0be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.721447', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f967086-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '96bfc8e160818bbba4157252341b11114288ecc93ea03c05ff142f037e057a15'}]}, 'timestamp': '2025-11-22 07:46:36.721692', '_unique_id': 'e1e9b15dad39417ba47cd184a9a2f12e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.722 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7147f99e-fb72-41bc-87ca-5daeadee2e38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.722870', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f96ab14-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '1023034e018393749e7c1d48ea054f7635281e5e4f8719a1f67f888cdba67751'}]}, 'timestamp': '2025-11-22 07:46:36.723189', '_unique_id': '74bfe45151324d7fa4261f2041d8f92a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.724 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.requests volume: 282 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.724 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3c30143-653a-40ca-b252-e362dfc08d3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 282, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.724379', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f96e304-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '07fda657abe42984c76258f54ff9e06a7b8d706ede1380cd7dfa8beefec5b8e6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.724379', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f96eb38-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '7f7ccdb922e9a32ffa6488b3c7682704a87d15765f82d2ed88be177ccd9b0a6b'}]}, 'timestamp': '2025-11-22 07:46:36.724856', '_unique_id': 'df93d8b96fcc40f28457042274e5ef4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.725 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '175a476d-b8ba-4306-a033-b19fd62f7013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.726006', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f972328-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '473dbda7a42c11d606910d26d2947d54a29c92c1888a09e572ca63e7ee32c1e0'}]}, 'timestamp': '2025-11-22 07:46:36.726310', '_unique_id': '9e201f2a0696497b83f33209b841c548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.727 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e98ae73-c055-444c-8067-5fcc89411a50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.727651', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f9763ba-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '82fe1ee6657a0ad9ab9d5b82c837e03608542402016b4c44564f7b6f458ae0f3'}]}, 'timestamp': '2025-11-22 07:46:36.727928', '_unique_id': 'eaba64302a5d4fd79a03249fe9d40f17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.728 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.747 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/cpu volume: 13140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f11c5439-ba59-4cfc-876e-eca6a2df6567', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13140000000, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'timestamp': '2025-11-22T07:46:36.729060', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5f9a7212-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.417147459, 'message_signature': 'a1ede02859e11302fb508cad9148be0832975c883c3a851aa0718b9d7bc95829'}]}, 'timestamp': '2025-11-22 07:46:36.748036', '_unique_id': 'd391f260f7334d30831ea664286b872b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.750 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.750 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>]
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.750 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.bytes volume: 29522432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.750 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89402ec5-142c-493a-9401-1488cbd07da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29522432, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.750411', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9add42-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '07f0433eff6e5c1fd988abf29d748122972065de3fc68e15bec191d666639caa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.750411', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9ae706-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '87a1912abc54b3ddb578d7876679f57ced82ba275bbcedc57b6fa7930f56c190'}]}, 'timestamp': '2025-11-22 07:46:36.750940', '_unique_id': '1c43117aa5234a7099f610847895942a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.752 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.752 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ccfd0b-2792-4545-9b92-72ccd045ac86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.752137', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9b1fc8-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': '0c4995a6c106336d7b68b15ebe8ccd6445bf98e3f7a2f95e94099fabbdc2f36d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.752137', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9b27ca-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': '208276bed058feddd1c6887f690cced5ef7a906fa7d90cb0ba058d08b6f754f8'}]}, 'timestamp': '2025-11-22 07:46:36.752575', '_unique_id': 'd90623f563a94c2b9f51b6956b837732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71e8bf21-ba17-4283-8a07-dd2f85001665', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.754003', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f9b69a6-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '58386d146911e91b5124c7231d4594a9c83135cfa3120a2d3f272a93cdd55e9f'}]}, 'timestamp': '2025-11-22 07:46:36.754341', '_unique_id': '28e27f97e1d2474a81034b666a6f6742'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.755 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/memory.usage volume: 42.71875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04547b69-5ca2-4db4-bb75-2c71d18e92f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.71875, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'timestamp': '2025-11-22T07:46:36.755715', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5f9bab96-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.417147459, 'message_signature': 'c3ae5af8e52a4dccba9b11bccc87c8d899f7937425e92251d44db9b166919abd'}]}, 'timestamp': '2025-11-22 07:46:36.755987', '_unique_id': '210218d8c07143748672c687d40dd791'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.757 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.757 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '263dcf4c-798f-48da-b9b8-56707403ac97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.757429', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9bef0c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': 'd00d8397a19c5a910bef8988a426162e02146d0800a790a7efeaf4eec33992e3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.757429', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9bf97a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.373869384, 'message_signature': '726a432cdfa3daa8617bdfe7166c2bc53c67c4647c5a2b18dc1e40c6a1a3423b'}]}, 'timestamp': '2025-11-22 07:46:36.757977', '_unique_id': '4df18895f854413f87612b8b2a4ad28a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.759 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.requests volume: 1065 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.759 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0677518-ea58-4294-91f1-6909c19bcd24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1065, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.759619', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9c4466-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '57d802dc59e8d44b44dfee0c4dfd79a736551c1d626a7987bbc60e5543dd39ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.759619', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9c4ede-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '39a3f7805a674c5725bcc1c20332c1f2243ae506e46aad37645c1e9906a6c1b1'}]}, 'timestamp': '2025-11-22 07:46:36.760169', '_unique_id': '029c515f2e45475fbaa47d7e0bb483d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.761 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87a93387-3eec-4112-b9b8-82da7a5ca874', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.761618', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f9c927c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': 'f258f00f4ab310a1bca2e1c5d01b4181ee9b9f21282c1479b883797d93d0db21'}]}, 'timestamp': '2025-11-22 07:46:36.761909', '_unique_id': 'd112530a22994d46acf231fe2ceeec9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>]
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1328693686>]
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.763 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b00f047e-eeab-4fc3-8c43-a3bbcfdc5b63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-vda', 'timestamp': '2025-11-22T07:46:36.763689', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f9ce22c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '91233c4af3a813f7e1b8804ca484ae5000e0f6a350ee353b518f8332c988ff75'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89-sda', 'timestamp': '2025-11-22T07:46:36.763689', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'instance-0000001b', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9cea1a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.332496095, 'message_signature': '4197ce7149bc24af3d0203aad59f7498be7670351a74621311078b4c348367b4'}]}, 'timestamp': '2025-11-22 07:46:36.764104', '_unique_id': '49177d5bb61b4c7c83e1f11615a68a28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 DEBUG ceilometer.compute.pollsters [-] 1cb44f55-1231-44bc-8a2d-e598899b2f89/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c455bfe9-25d1-476c-87a5-31f9ecdb700e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001b-1cb44f55-1231-44bc-8a2d-e598899b2f89-tapd3f68e24-4f', 'timestamp': '2025-11-22T07:46:36.765271', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1328693686', 'name': 'tapd3f68e24-4f', 'instance_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'instance_type': 'm1.nano', 'host': 'd152851af04cbf04d33298cb4c6c48ca5ecdd1398913fd0d3a458fff', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:ee:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3f68e24-4f'}, 'message_id': '5f9d2138-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4339.366249166, 'message_signature': '77dc4353e92cc5e03b78b3016b54237a757e3faadd048b3d7a5f44410c4dbefe'}]}, 'timestamp': '2025-11-22 07:46:36.765529', '_unique_id': '0dafd1a08add42d2aab30a8e2a81b605'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:46:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:46:36 np0005531887 podman[217246]: 2025-11-22 07:46:36.841132081 +0000 UTC m=+0.059258520 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, tcib_managed=true)
Nov 22 02:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:37.318 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:37.319 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:37.319 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:37 np0005531887 nova_compute[186849]: 2025-11-22 07:46:37.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:37 np0005531887 nova_compute[186849]: 2025-11-22 07:46:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:37 np0005531887 nova_compute[186849]: 2025-11-22 07:46:37.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:46:37 np0005531887 nova_compute[186849]: 2025-11-22 07:46:37.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:46:37 np0005531887 nova_compute[186849]: 2025-11-22 07:46:37.957 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:38 np0005531887 nova_compute[186849]: 2025-11-22 07:46:38.109 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:38 np0005531887 nova_compute[186849]: 2025-11-22 07:46:38.110 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:38 np0005531887 nova_compute[186849]: 2025-11-22 07:46:38.110 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:46:38 np0005531887 nova_compute[186849]: 2025-11-22 07:46:38.110 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1cb44f55-1231-44bc-8a2d-e598899b2f89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.249 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.789 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.806 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.806 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.807 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.808 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.826 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.826 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.827 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.827 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.889 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.950 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:40 np0005531887 nova_compute[186849]: 2025-11-22 07:46:40.951 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.030 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.186 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.187 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5566MB free_disk=73.42938613891602GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.187 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.187 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.277 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 1cb44f55-1231-44bc-8a2d-e598899b2f89 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.277 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.277 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.319 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.335 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.358 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:46:41 np0005531887 nova_compute[186849]: 2025-11-22 07:46:41.359 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:41 np0005531887 podman[217271]: 2025-11-22 07:46:41.860059115 +0000 UTC m=+0.078728471 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:46:42 np0005531887 nova_compute[186849]: 2025-11-22 07:46:42.320 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:46:42 np0005531887 nova_compute[186849]: 2025-11-22 07:46:42.320 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:46:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:42.916 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:42.917 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:46:42 np0005531887 nova_compute[186849]: 2025-11-22 07:46:42.916 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:42 np0005531887 nova_compute[186849]: 2025-11-22 07:46:42.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:45 np0005531887 nova_compute[186849]: 2025-11-22 07:46:45.250 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:46 np0005531887 podman[217295]: 2025-11-22 07:46:46.855051077 +0000 UTC m=+0.058861439 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 22 02:46:46 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:46.919 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.465 186853 DEBUG nova.compute.manager [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.465 186853 DEBUG nova.compute.manager [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing instance network info cache due to event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.465 186853 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.465 186853 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.466 186853 DEBUG nova.network.neutron [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:47 np0005531887 nova_compute[186849]: 2025-11-22 07:46:47.961 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:49 np0005531887 nova_compute[186849]: 2025-11-22 07:46:49.735 186853 DEBUG nova.network.neutron [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated VIF entry in instance network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:49 np0005531887 nova_compute[186849]: 2025-11-22 07:46:49.736 186853 DEBUG nova.network.neutron [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:49 np0005531887 nova_compute[186849]: 2025-11-22 07:46:49.748 186853 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:50 np0005531887 nova_compute[186849]: 2025-11-22 07:46:50.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:51 np0005531887 nova_compute[186849]: 2025-11-22 07:46:51.754 186853 DEBUG nova.compute.manager [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:51 np0005531887 nova_compute[186849]: 2025-11-22 07:46:51.754 186853 DEBUG nova.compute.manager [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing instance network info cache due to event network-changed-d3f68e24-4f74-40aa-8f91-ff43c969dcd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:46:51 np0005531887 nova_compute[186849]: 2025-11-22 07:46:51.754 186853 DEBUG oslo_concurrency.lockutils [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:46:51 np0005531887 nova_compute[186849]: 2025-11-22 07:46:51.755 186853 DEBUG oslo_concurrency.lockutils [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:46:51 np0005531887 nova_compute[186849]: 2025-11-22 07:46:51.755 186853 DEBUG nova.network.neutron [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Refreshing network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:46:52 np0005531887 nova_compute[186849]: 2025-11-22 07:46:52.965 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:53 np0005531887 nova_compute[186849]: 2025-11-22 07:46:53.116 186853 DEBUG nova.network.neutron [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updated VIF entry in instance network info cache for port d3f68e24-4f74-40aa-8f91-ff43c969dcd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:46:53 np0005531887 nova_compute[186849]: 2025-11-22 07:46:53.117 186853 DEBUG nova.network.neutron [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [{"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:53 np0005531887 nova_compute[186849]: 2025-11-22 07:46:53.139 186853 DEBUG oslo_concurrency.lockutils [req-2873a349-e02f-4700-99f3-4ae50ff74d72 req-90343f3c-d611-474c-a350-423cd47a41ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1cb44f55-1231-44bc-8a2d-e598899b2f89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:46:53 np0005531887 podman[217317]: 2025-11-22 07:46:53.83863575 +0000 UTC m=+0.054287687 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 02:46:53 np0005531887 podman[217318]: 2025-11-22 07:46:53.899008955 +0000 UTC m=+0.105453397 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.339 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.340 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.340 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.340 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.340 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.349 186853 INFO nova.compute.manager [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Terminating instance#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.355 186853 DEBUG nova.compute.manager [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:46:55 np0005531887 kernel: tapd3f68e24-4f (unregistering): left promiscuous mode
Nov 22 02:46:55 np0005531887 NetworkManager[55210]: <info>  [1763797615.3904] device (tapd3f68e24-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:46:55 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:55Z|00070|binding|INFO|Releasing lport d3f68e24-4f74-40aa-8f91-ff43c969dcd0 from this chassis (sb_readonly=0)
Nov 22 02:46:55 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:55Z|00071|binding|INFO|Setting lport d3f68e24-4f74-40aa-8f91-ff43c969dcd0 down in Southbound
Nov 22 02:46:55 np0005531887 ovn_controller[95130]: 2025-11-22T07:46:55Z|00072|binding|INFO|Removing iface tapd3f68e24-4f ovn-installed in OVS
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.415 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Nov 22 02:46:55 np0005531887 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001b.scope: Consumed 15.981s CPU time.
Nov 22 02:46:55 np0005531887 systemd-machined[153180]: Machine qemu-12-instance-0000001b terminated.
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.527 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:ee:47 10.100.0.6'], port_security=['fa:16:3e:fd:ee:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1cb44f55-1231-44bc-8a2d-e598899b2f89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ba545ee-7ef7-4120-9b36-dfb927d132f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=750112b4-7c3d-47fc-a624-7726e73fdc53, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d3f68e24-4f74-40aa-8f91-ff43c969dcd0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.528 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d3f68e24-4f74-40aa-8f91-ff43c969dcd0 in datapath 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 unbound from our chassis#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.529 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.531 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[85f598f2-9703-486c-be1c-72e9c5d58ecd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.531 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 namespace which is not needed anymore#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.621 186853 INFO nova.virt.libvirt.driver [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Instance destroyed successfully.#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.622 186853 DEBUG nova.objects.instance [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'resources' on Instance uuid 1cb44f55-1231-44bc-8a2d-e598899b2f89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:46:55 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [NOTICE]   (217013) : haproxy version is 2.8.14-c23fe91
Nov 22 02:46:55 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [NOTICE]   (217013) : path to executable is /usr/sbin/haproxy
Nov 22 02:46:55 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [WARNING]  (217013) : Exiting Master process...
Nov 22 02:46:55 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [ALERT]    (217013) : Current worker (217015) exited with code 143 (Terminated)
Nov 22 02:46:55 np0005531887 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217009]: [WARNING]  (217013) : All workers exited. Exiting... (0)
Nov 22 02:46:55 np0005531887 systemd[1]: libpod-6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96.scope: Deactivated successfully.
Nov 22 02:46:55 np0005531887 podman[217403]: 2025-11-22 07:46:55.680685704 +0000 UTC m=+0.044957439 container died 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:46:55 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96-userdata-shm.mount: Deactivated successfully.
Nov 22 02:46:55 np0005531887 systemd[1]: var-lib-containers-storage-overlay-15a6ea8cf80624f138d8bed53e0b03759b84cbaf74fd404f8e41b0ac2fc11410-merged.mount: Deactivated successfully.
Nov 22 02:46:55 np0005531887 podman[217403]: 2025-11-22 07:46:55.725014238 +0000 UTC m=+0.089285973 container cleanup 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:46:55 np0005531887 systemd[1]: libpod-conmon-6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96.scope: Deactivated successfully.
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.747 186853 DEBUG nova.virt.libvirt.vif [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:45:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1328693686',display_name='tempest-FloatingIPsAssociationTestJSON-server-1328693686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1328693686',id=27,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-fpdsdowm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:00Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=1cb44f55-1231-44bc-8a2d-e598899b2f89,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.747 186853 DEBUG nova.network.os_vif_util [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "address": "fa:16:3e:fd:ee:47", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3f68e24-4f", "ovs_interfaceid": "d3f68e24-4f74-40aa-8f91-ff43c969dcd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.748 186853 DEBUG nova.network.os_vif_util [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.748 186853 DEBUG os_vif [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.750 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.750 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3f68e24-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.755 186853 INFO os_vif [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:ee:47,bridge_name='br-int',has_traffic_filtering=True,id=d3f68e24-4f74-40aa-8f91-ff43c969dcd0,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3f68e24-4f')#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.756 186853 INFO nova.virt.libvirt.driver [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Deleting instance files /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89_del#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.757 186853 INFO nova.virt.libvirt.driver [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Deletion of /var/lib/nova/instances/1cb44f55-1231-44bc-8a2d-e598899b2f89_del complete#033[00m
Nov 22 02:46:55 np0005531887 podman[217434]: 2025-11-22 07:46:55.786223144 +0000 UTC m=+0.043246668 container remove 6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.791 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[427a8172-5269-453e-9db1-dc8445480098]: (4, ('Sat Nov 22 07:46:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 (6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96)\n6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96\nSat Nov 22 07:46:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 (6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96)\n6e11267e02823b16401a196da038f4e009db113cb281a2187eca9fb072693e96\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.793 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2e24d91c-a7d0-445f-b4ff-c71a5ddf6ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.794 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dfbfc3c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.795 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 kernel: tap9dfbfc3c-a0: left promiscuous mode
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.807 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.808 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f5e805-4d47-429a-adc4-80558b8ccd8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.829 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d518cade-9473-4c0c-9d9e-1f22abf47e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.830 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ee712c87-e6c8-4a66-8b20-59afda6d6559]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.846 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab4b10f-8a33-446c-92a1-0161f53c208a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430219, 'reachable_time': 28743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217449, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.847 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:46:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:46:55.848 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f1e6d7-1bcd-4594-96da-513fb526f728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:46:55 np0005531887 systemd[1]: run-netns-ovnmeta\x2d9dfbfc3c\x2da1e1\x2d4f51\x2d927e\x2dfe8b0b4a7202.mount: Deactivated successfully.
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.916 186853 INFO nova.compute.manager [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.916 186853 DEBUG oslo.service.loopingcall [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.916 186853 DEBUG nova.compute.manager [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:46:55 np0005531887 nova_compute[186849]: 2025-11-22 07:46:55.917 186853 DEBUG nova.network.neutron [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.600 186853 DEBUG nova.compute.manager [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-unplugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.601 186853 DEBUG oslo_concurrency.lockutils [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.601 186853 DEBUG oslo_concurrency.lockutils [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.601 186853 DEBUG oslo_concurrency.lockutils [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.601 186853 DEBUG nova.compute.manager [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] No waiting events found dispatching network-vif-unplugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:56 np0005531887 nova_compute[186849]: 2025-11-22 07:46:56.601 186853 DEBUG nova.compute.manager [req-3a6271ba-d5d4-4435-98d7-4e0717a72100 req-418beeaa-f760-4007-9efa-f6af8b1aaa81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-unplugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.591 186853 DEBUG nova.network.neutron [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.633 186853 INFO nova.compute.manager [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Took 1.72 seconds to deallocate network for instance.#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.720 186853 DEBUG nova.compute.manager [req-17e9e572-fa5c-4792-8479-14d753f147f3 req-8a4f983a-ba14-4998-9462-646558018482 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-deleted-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.766 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.766 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.833 186853 DEBUG nova.compute.provider_tree [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.849 186853 DEBUG nova.scheduler.client.report [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.879 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:57 np0005531887 nova_compute[186849]: 2025-11-22 07:46:57.901 186853 INFO nova.scheduler.client.report [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Deleted allocations for instance 1cb44f55-1231-44bc-8a2d-e598899b2f89#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.009 186853 DEBUG oslo_concurrency.lockutils [None req-5ff49bc2-4969-44af-ab22-34d8e514d0d3 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.709 186853 DEBUG nova.compute.manager [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.709 186853 DEBUG oslo_concurrency.lockutils [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.710 186853 DEBUG oslo_concurrency.lockutils [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.710 186853 DEBUG oslo_concurrency.lockutils [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1cb44f55-1231-44bc-8a2d-e598899b2f89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.710 186853 DEBUG nova.compute.manager [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] No waiting events found dispatching network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:46:58 np0005531887 nova_compute[186849]: 2025-11-22 07:46:58.711 186853 WARNING nova.compute.manager [req-d9b5e551-bdde-47d1-a7a8-e9a118d673f3 req-8436645c-695f-4f2e-8144-2349dd77cb5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Received unexpected event network-vif-plugged-d3f68e24-4f74-40aa-8f91-ff43c969dcd0 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:46:59 np0005531887 podman[217450]: 2025-11-22 07:46:59.838064726 +0000 UTC m=+0.050903216 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:47:00 np0005531887 nova_compute[186849]: 2025-11-22 07:47:00.255 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:00 np0005531887 nova_compute[186849]: 2025-11-22 07:47:00.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:01 np0005531887 nova_compute[186849]: 2025-11-22 07:47:01.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:01 np0005531887 nova_compute[186849]: 2025-11-22 07:47:01.821 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:03 np0005531887 podman[217475]: 2025-11-22 07:47:03.832554161 +0000 UTC m=+0.054111634 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:47:05 np0005531887 nova_compute[186849]: 2025-11-22 07:47:05.256 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:05 np0005531887 nova_compute[186849]: 2025-11-22 07:47:05.754 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:07 np0005531887 podman[217494]: 2025-11-22 07:47:07.835343927 +0000 UTC m=+0.057727161 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:47:10 np0005531887 nova_compute[186849]: 2025-11-22 07:47:10.259 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:10 np0005531887 nova_compute[186849]: 2025-11-22 07:47:10.621 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797615.6187253, 1cb44f55-1231-44bc-8a2d-e598899b2f89 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:10 np0005531887 nova_compute[186849]: 2025-11-22 07:47:10.622 186853 INFO nova.compute.manager [-] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:10 np0005531887 nova_compute[186849]: 2025-11-22 07:47:10.649 186853 DEBUG nova.compute.manager [None req-39dbf5ba-8832-4910-b50b-f99b62d00a03 - - - - - -] [instance: 1cb44f55-1231-44bc-8a2d-e598899b2f89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:10 np0005531887 nova_compute[186849]: 2025-11-22 07:47:10.757 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.525 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.525 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.576 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.600 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.601 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.651 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.774 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.775 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.784 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.784 186853 INFO nova.compute.claims [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.787 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:12 np0005531887 podman[217517]: 2025-11-22 07:47:12.855146741 +0000 UTC m=+0.072647316 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.961 186853 DEBUG nova.compute.provider_tree [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:12 np0005531887 nova_compute[186849]: 2025-11-22 07:47:12.984 186853 DEBUG nova.scheduler.client.report [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.026 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.028 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.034 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.034 186853 INFO nova.compute.claims [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.068 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.068 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.114 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.115 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.229 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.230 186853 DEBUG nova.network.neutron [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.273 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.342 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.342 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.343 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.356 186853 DEBUG nova.compute.provider_tree [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.398 186853 DEBUG nova.scheduler.client.report [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.402 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.506 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.560 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.560 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.586 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "8485ccbe-b484-4249-90b6-9aea4bfdf9e2" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.586 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.621 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.622 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.623 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Creating image(s)#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.623 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.623 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.624 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.637 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.661 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.662 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.670 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.670 186853 INFO nova.compute.claims [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.694 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.695 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.695 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.709 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.733 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.734 186853 DEBUG nova.network.neutron [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.766 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.769 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.770 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.803 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.804 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.804 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.826 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.860 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.861 186853 DEBUG nova.virt.disk.api [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Checking if we can resize image /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.861 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.917 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.917 186853 DEBUG nova.virt.disk.api [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Cannot resize image /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.918 186853 DEBUG nova.objects.instance [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'migration_context' on Instance uuid 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.934 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.935 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Ensure instance console log exists: /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.935 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.936 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:13 np0005531887 nova_compute[186849]: 2025-11-22 07:47:13.936 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.046 186853 DEBUG nova.network.neutron [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.046 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.048 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.053 186853 DEBUG nova.compute.provider_tree [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.056 186853 WARNING nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.073 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.074 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.081 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.081 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.082 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.082 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.083 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.083 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.083 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.083 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.084 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.088 186853 DEBUG nova.objects.instance [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.091 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.093 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.093 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Creating image(s)#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.093 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.093 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.094 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.106 186853 DEBUG nova.scheduler.client.report [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.110 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <uuid>7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5</uuid>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <name>instance-00000021</name>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersOnMultiNodesTest-server-519219898-1</nova:name>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:47:14</nova:creationTime>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:user uuid="f3272c6a12f44ac18db2715976e29248">tempest-ServersOnMultiNodesTest-214232393-project-member</nova:user>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:project uuid="b764107a4dca4a799bc3edefe458310b">tempest-ServersOnMultiNodesTest-214232393</nova:project>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="serial">7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="uuid">7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.config"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/console.log" append="off"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:47:14 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:47:14 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.112 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.181 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.181 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.182 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.192 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.213 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.214 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.215 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Using config drive#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.218 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.218 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.246 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.246 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.283 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.283 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.284 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.336 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.337 186853 DEBUG nova.virt.disk.api [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Checking if we can resize image /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.338 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.392 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.393 186853 DEBUG nova.virt.disk.api [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Cannot resize image /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.393 186853 DEBUG nova.objects.instance [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'migration_context' on Instance uuid 89be3b77-79e2-4c6a-9107-a17f3f4a3fca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.411 186853 DEBUG nova.network.neutron [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.412 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.424 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.424 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Ensure instance console log exists: /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.425 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.425 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.425 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.427 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.436 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.436 186853 DEBUG nova.network.neutron [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.440 186853 WARNING nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.453 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.454 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.456 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.457 186853 DEBUG nova.virt.libvirt.host [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.458 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.458 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.458 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.458 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.459 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.460 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.460 186853 DEBUG nova.virt.hardware [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.463 186853 DEBUG nova.objects.instance [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'pci_devices' on Instance uuid 89be3b77-79e2-4c6a-9107-a17f3f4a3fca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.466 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Creating config drive at /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.config#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.471 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppb_p3z8k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.491 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <uuid>89be3b77-79e2-4c6a-9107-a17f3f4a3fca</uuid>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <name>instance-00000022</name>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersOnMultiNodesTest-server-519219898-2</nova:name>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:47:14</nova:creationTime>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:user uuid="f3272c6a12f44ac18db2715976e29248">tempest-ServersOnMultiNodesTest-214232393-project-member</nova:user>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:        <nova:project uuid="b764107a4dca4a799bc3edefe458310b">tempest-ServersOnMultiNodesTest-214232393</nova:project>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="serial">89be3b77-79e2-4c6a-9107-a17f3f4a3fca</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="uuid">89be3b77-79e2-4c6a-9107-a17f3f4a3fca</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.config"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/console.log" append="off"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:47:14 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:47:14 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:47:14 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:47:14 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.493 186853 INFO nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.565 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.572 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.572 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.573 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Using config drive#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.594 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppb_p3z8k" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 systemd-machined[153180]: New machine qemu-13-instance-00000021.
Nov 22 02:47:14 np0005531887 systemd[1]: Started Virtual Machine qemu-13-instance-00000021.
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.766 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.767 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.767 186853 INFO nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating image(s)#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.768 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.768 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.769 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.782 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.840 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.841 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.841 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.852 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.873 186853 INFO nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Creating config drive at /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.config#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.879 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9y5yaktt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.921 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:14 np0005531887 nova_compute[186849]: 2025-11-22 07:47:14.922 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.011 186853 DEBUG oslo_concurrency.processutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9y5yaktt" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.027 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797635.0261753, 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.027 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.031 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.031 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.041 186853 INFO nova.virt.libvirt.driver [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance spawned successfully.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.042 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.065 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.075 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.080 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.080 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.081 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.082 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.082 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.083 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 systemd-machined[153180]: New machine qemu-14-instance-00000022.
Nov 22 02:47:15 np0005531887 systemd[1]: Started Virtual Machine qemu-14-instance-00000022.
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.121 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.122 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797635.0304806, 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.122 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.180 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.185 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.215 186853 INFO nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Took 1.59 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.215 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.261 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.334 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk 1073741824" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.335 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.494s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.337 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.404 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.406 186853 DEBUG nova.virt.disk.api [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Checking if we can resize image /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.407 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.472 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.476 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.477 186853 DEBUG nova.virt.disk.api [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Cannot resize image /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.477 186853 DEBUG nova.objects.instance [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'migration_context' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.497 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.497 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Ensure instance console log exists: /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.498 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.499 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.499 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.528 186853 INFO nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Took 2.83 seconds to build instance.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.560 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.608 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797635.607787, 89be3b77-79e2-4c6a-9107-a17f3f4a3fca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.610 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.612 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.612 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.616 186853 INFO nova.virt.libvirt.driver [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance spawned successfully.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.617 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.654 186853 DEBUG nova.policy [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.674 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.675 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.676 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.677 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.677 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.678 186853 DEBUG nova.virt.libvirt.driver [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.683 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.688 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.730 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.731 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797635.612274, 89be3b77-79e2-4c6a-9107-a17f3f4a3fca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.731 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.760 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.784 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.788 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.813 186853 INFO nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Took 1.72 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.813 186853 DEBUG nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.818 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:15 np0005531887 nova_compute[186849]: 2025-11-22 07:47:15.967 186853 INFO nova.compute.manager [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Took 3.23 seconds to build instance.#033[00m
Nov 22 02:47:16 np0005531887 nova_compute[186849]: 2025-11-22 07:47:16.207 186853 DEBUG oslo_concurrency.lockutils [None req-dec85d4b-69fb-4384-87d2-77ef6720beb3 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:17 np0005531887 podman[217641]: 2025-11-22 07:47:17.840292068 +0000 UTC m=+0.058367518 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.262 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.270 186853 DEBUG nova.network.neutron [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Successfully updated port: 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.285 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.286 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.286 186853 DEBUG nova.network.neutron [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.594 186853 DEBUG nova.network.neutron [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.662 186853 DEBUG nova.compute.manager [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-changed-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.663 186853 DEBUG nova.compute.manager [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Refreshing instance network info cache due to event network-changed-28cefb09-6f44-4c5f-b924-c2e3ca0082e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.663 186853 DEBUG oslo_concurrency.lockutils [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:20 np0005531887 nova_compute[186849]: 2025-11-22 07:47:20.762 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.060 186853 DEBUG nova.network.neutron [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.084 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.085 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance network_info: |[{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.086 186853 DEBUG oslo_concurrency.lockutils [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.086 186853 DEBUG nova.network.neutron [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Refreshing network info cache for port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.089 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Start _get_guest_xml network_info=[{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.095 186853 WARNING nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.102 186853 DEBUG nova.virt.libvirt.host [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.103 186853 DEBUG nova.virt.libvirt.host [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.107 186853 DEBUG nova.virt.libvirt.host [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.108 186853 DEBUG nova.virt.libvirt.host [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.109 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.110 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.110 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.111 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.111 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.111 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.112 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.112 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.112 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.113 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.113 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.114 186853 DEBUG nova.virt.hardware [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.118 186853 DEBUG nova.virt.libvirt.vif [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-projec
t-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:47:14Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.119 186853 DEBUG nova.network.os_vif_util [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.120 186853 DEBUG nova.network.os_vif_util [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.122 186853 DEBUG nova.objects.instance [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.134 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <uuid>b9c07170-ca6f-422e-8f1c-9dfd5cc943a4</uuid>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <name>instance-00000023</name>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:name>tempest-LiveMigrationTest-server-1107734578</nova:name>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:47:22</nova:creationTime>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:user uuid="8a738b980aad493b9a21da7d5a5ccf8a">tempest-LiveMigrationTest-2093743563-project-member</nova:user>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:project uuid="d48bda61691e4f778b6d30c0dc773a30">tempest-LiveMigrationTest-2093743563</nova:project>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        <nova:port uuid="28cefb09-6f44-4c5f-b924-c2e3ca0082e1">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="serial">b9c07170-ca6f-422e-8f1c-9dfd5cc943a4</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="uuid">b9c07170-ca6f-422e-8f1c-9dfd5cc943a4</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:90:34:2c"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <target dev="tap28cefb09-6f"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/console.log" append="off"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:47:22 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:47:22 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:47:22 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:47:22 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.141 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Preparing to wait for external event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.142 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.142 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.143 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.144 186853 DEBUG nova.virt.libvirt.vif [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743
563-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:47:14Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.144 186853 DEBUG nova.network.os_vif_util [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.145 186853 DEBUG nova.network.os_vif_util [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.146 186853 DEBUG os_vif [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.147 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.147 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.148 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.152 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.153 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28cefb09-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.154 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap28cefb09-6f, col_values=(('external_ids', {'iface-id': '28cefb09-6f44-4c5f-b924-c2e3ca0082e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:34:2c', 'vm-uuid': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.156 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 NetworkManager[55210]: <info>  [1763797642.1570] manager: (tap28cefb09-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.159 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.164 186853 INFO os_vif [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f')#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.231 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.231 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.232 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No VIF found with MAC fa:16:3e:90:34:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.232 186853 INFO nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Using config drive#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.733 186853 INFO nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating config drive at /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.740 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf5k110fk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.873 186853 DEBUG oslo_concurrency.processutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf5k110fk" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:22 np0005531887 kernel: tap28cefb09-6f: entered promiscuous mode
Nov 22 02:47:22 np0005531887 NetworkManager[55210]: <info>  [1763797642.9266] manager: (tap28cefb09-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:22Z|00073|binding|INFO|Claiming lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for this chassis.
Nov 22 02:47:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:22Z|00074|binding|INFO|28cefb09-6f44-4c5f-b924-c2e3ca0082e1: Claiming fa:16:3e:90:34:2c 10.100.0.9
Nov 22 02:47:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:22Z|00075|binding|INFO|Claiming lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d for this chassis.
Nov 22 02:47:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:22Z|00076|binding|INFO|eb321ea2-ecc9-494b-a270-c3aac4f36e7d: Claiming fa:16:3e:d2:b0:13 19.80.0.49
Nov 22 02:47:22 np0005531887 nova_compute[186849]: 2025-11-22 07:47:22.934 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.951 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:34:2c 10.100.0.9'], port_security=['fa:16:3e:90:34:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-109341048', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-109341048', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=28cefb09-6f44-4c5f-b924-c2e3ca0082e1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.953 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b0:13 19.80.0.49'], port_security=['fa:16:3e:d2:b0:13 19.80.0.49'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['28cefb09-6f44-4c5f-b924-c2e3ca0082e1'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1622735635', 'neutron:cidrs': '19.80.0.49/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c410ae8d-536e-4819-b766-652bc78ac3e4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1622735635', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=83c2898f-e4b8-43d0-8099-6e9553385d03, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb321ea2-ecc9-494b-a270-c3aac4f36e7d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.954 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 bound to our chassis#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.956 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70#033[00m
Nov 22 02:47:22 np0005531887 systemd-machined[153180]: New machine qemu-15-instance-00000023.
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.974 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73fb3c10-a235-4465-b3b1-16a7f2af6d55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.976 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3f966e1-81 in ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.984 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3f966e1-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.985 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b17d7308-29c3-4ef5-baad-c9ca44d2a3f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:22.991 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73ea728f-d0f6-4ac0-a6f8-7d33218c62af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.005 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4f8e3a-21a5-465b-a208-66f2f7819ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 systemd[1]: Started Virtual Machine qemu-15-instance-00000023.
Nov 22 02:47:23 np0005531887 systemd-udevd[217682]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.028 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 NetworkManager[55210]: <info>  [1763797643.0349] device (tap28cefb09-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:47:23 np0005531887 NetworkManager[55210]: <info>  [1763797643.0360] device (tap28cefb09-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.038 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c436fbb2-42f2-4705-89d3-bbf0ae30a648]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:23Z|00077|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 ovn-installed in OVS
Nov 22 02:47:23 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:23Z|00078|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 up in Southbound
Nov 22 02:47:23 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:23Z|00079|binding|INFO|Setting lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d up in Southbound
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.041 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.082 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[94dbd8c0-e271-498f-84ca-f927dd5f1e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 systemd-udevd[217685]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:47:23 np0005531887 NetworkManager[55210]: <info>  [1763797643.0908] manager: (tapc3f966e1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.090 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e27f643e-1552-448b-9623-be1e9ee9cc30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.128 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d3101d9d-61a5-4e22-97f4-92a7fb5d2d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.132 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5f3310-3cc1-4303-9f1e-a239a4f31335]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 NetworkManager[55210]: <info>  [1763797643.1573] device (tapc3f966e1-80): carrier: link connected
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.163 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[366a4d2d-adbd-4151-b1bf-5e97efc2f87d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.185 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4514ffa1-4d11-4248-906d-0d9498af2c18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438576, 'reachable_time': 28729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217712, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.203 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[19638d53-36d4-4a95-be53-6df2fafc73be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:7499'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438576, 'tstamp': 438576}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217713, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.222 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8aee88af-ba54-4aa4-83d1-a85aa08dcba3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438576, 'reachable_time': 28729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217714, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.257 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0908e8f3-1b5c-4f74-b7c5-85bd488eb8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.322 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dd423fe9-b9ba-40ac-a2ee-4d4ff2dd3f06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.324 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.324 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.325 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 kernel: tapc3f966e1-80: entered promiscuous mode
Nov 22 02:47:23 np0005531887 NetworkManager[55210]: <info>  [1763797643.3276] manager: (tapc3f966e1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.333 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.334 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:23Z|00080|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.335 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.335 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.347 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a4063551-01f7-41a7-b31f-ec278c44e67e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.349 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.349 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.350 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'env', 'PROCESS_TAG=haproxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.516 186853 DEBUG nova.compute.manager [req-970673a0-55fd-4e1b-814c-65c21e810686 req-ab899384-d276-465b-8ef2-67906077be32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.517 186853 DEBUG oslo_concurrency.lockutils [req-970673a0-55fd-4e1b-814c-65c21e810686 req-ab899384-d276-465b-8ef2-67906077be32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.518 186853 DEBUG oslo_concurrency.lockutils [req-970673a0-55fd-4e1b-814c-65c21e810686 req-ab899384-d276-465b-8ef2-67906077be32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.518 186853 DEBUG oslo_concurrency.lockutils [req-970673a0-55fd-4e1b-814c-65c21e810686 req-ab899384-d276-465b-8ef2-67906077be32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.518 186853 DEBUG nova.compute.manager [req-970673a0-55fd-4e1b-814c-65c21e810686 req-ab899384-d276-465b-8ef2-67906077be32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Processing event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.565 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797643.5646906, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.566 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Started (Lifecycle Event)#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.568 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.582 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.587 186853 INFO nova.virt.libvirt.driver [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance spawned successfully.#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.588 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.601 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.606 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.642 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.643 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.643 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.644 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.644 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.645 186853 DEBUG nova.virt.libvirt.driver [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.649 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.650 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797643.5661867, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.650 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.687 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.692 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797643.5716128, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.693 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.719 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.724 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.727 186853 DEBUG nova.network.neutron [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updated VIF entry in instance network info cache for port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.728 186853 DEBUG nova.network.neutron [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.752 186853 INFO nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Took 8.99 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.754 186853 DEBUG nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.761 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:47:23 np0005531887 podman[217751]: 2025-11-22 07:47:23.764225721 +0000 UTC m=+0.067646933 container create f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.781 186853 DEBUG oslo_concurrency.lockutils [req-df3a152d-02b9-4a38-86ac-6519c6922d4d req-29c601ed-6837-4d24-b2a5-0891cc2e96dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:23 np0005531887 systemd[1]: Started libpod-conmon-f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84.scope.
Nov 22 02:47:23 np0005531887 podman[217751]: 2025-11-22 07:47:23.729089733 +0000 UTC m=+0.032510965 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:47:23 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:47:23 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1fbd7087e57143473a70d6697fc047cfb1fb8a261f57ba3314945890fb01409/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:47:23 np0005531887 podman[217751]: 2025-11-22 07:47:23.845153068 +0000 UTC m=+0.148574310 container init f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.848 186853 INFO nova.compute.manager [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Took 10.24 seconds to build instance.#033[00m
Nov 22 02:47:23 np0005531887 podman[217751]: 2025-11-22 07:47:23.850814307 +0000 UTC m=+0.154235519 container start f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:47:23 np0005531887 nova_compute[186849]: 2025-11-22 07:47:23.870 186853 DEBUG oslo_concurrency.lockutils [None req-491aae1b-be86-4cfa-859e-67f8489bec26 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:23 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [NOTICE]   (217769) : New worker (217771) forked
Nov 22 02:47:23 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [NOTICE]   (217769) : Loading success.
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.909 104084 INFO neutron.agent.ovn.metadata.agent [-] Port eb321ea2-ecc9-494b-a270-c3aac4f36e7d in datapath c410ae8d-536e-4819-b766-652bc78ac3e4 unbound from our chassis#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.912 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c410ae8d-536e-4819-b766-652bc78ac3e4#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.922 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a3638901-caa5-4480-aee9-3089695bf314]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.924 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc410ae8d-51 in ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.926 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc410ae8d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.926 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[37ac2460-0712-4b27-833d-479fc9cc26cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.927 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[db408d85-2021-4553-971a-270b579e76f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.945 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ab31cb87-2e5a-400e-8035-67bac9fa267c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:23.973 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b08e04-2885-4e9b-8d15-a22509d4e998]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.009 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[606f99c1-542c-4ab7-8136-c1b6bffac219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 NetworkManager[55210]: <info>  [1763797644.0193] manager: (tapc410ae8d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Nov 22 02:47:24 np0005531887 systemd-udevd[217696]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.017 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d94a87-aca9-4928-8733-61c79e7ea29c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.056 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[369f97cf-7c24-4883-94c5-a336e12420e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 podman[217782]: 2025-11-22 07:47:24.062596191 +0000 UTC m=+0.090954014 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.060 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[803adb3b-ab9c-4017-9ace-77bf3b139ec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 podman[217784]: 2025-11-22 07:47:24.087217723 +0000 UTC m=+0.110649685 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 02:47:24 np0005531887 NetworkManager[55210]: <info>  [1763797644.0912] device (tapc410ae8d-50): carrier: link connected
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.097 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c8498532-d07a-40bf-97b0-d327da17d83b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.136 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d386d1-2397-4d6a-9aa0-c83501d51da2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc410ae8d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:32:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438669, 'reachable_time': 16576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217832, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.157 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[426632bc-6f8d-4098-9f41-80a8f30f314d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:3239'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438669, 'tstamp': 438669}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217833, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.192 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cbbc7de0-3e84-48e5-a664-8262b5492d9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc410ae8d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:32:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438669, 'reachable_time': 16576, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217834, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.232 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[92a1783a-0741-467f-9960-35a48e41c661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.328 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d0665c-5dc7-4c59-8534-cf775a0a2bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.329 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc410ae8d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.330 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.330 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc410ae8d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:24 np0005531887 kernel: tapc410ae8d-50: entered promiscuous mode
Nov 22 02:47:24 np0005531887 NetworkManager[55210]: <info>  [1763797644.3338] manager: (tapc410ae8d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 22 02:47:24 np0005531887 nova_compute[186849]: 2025-11-22 07:47:24.333 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.337 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc410ae8d-50, col_values=(('external_ids', {'iface-id': 'adbcca2d-be43-4042-953b-c108dbe75276'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:24 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:24Z|00081|binding|INFO|Releasing lport adbcca2d-be43-4042-953b-c108dbe75276 from this chassis (sb_readonly=0)
Nov 22 02:47:24 np0005531887 nova_compute[186849]: 2025-11-22 07:47:24.351 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.352 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.354 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[51c5e522-54c6-4f1d-8eea-314e43826a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.355 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-c410ae8d-536e-4819-b766-652bc78ac3e4
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID c410ae8d-536e-4819-b766-652bc78ac3e4
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:47:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:24.355 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'env', 'PROCESS_TAG=haproxy-c410ae8d-536e-4819-b766-652bc78ac3e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c410ae8d-536e-4819-b766-652bc78ac3e4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:47:24 np0005531887 podman[217867]: 2025-11-22 07:47:24.791843458 +0000 UTC m=+0.094733536 container create f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:47:24 np0005531887 podman[217867]: 2025-11-22 07:47:24.734430395 +0000 UTC m=+0.037320473 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:47:24 np0005531887 systemd[1]: Started libpod-conmon-f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8.scope.
Nov 22 02:47:24 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:47:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e30640c6b9e775f6252005b57cb2c172d85c4ea8fd38e93384ae4bdc3573cd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:47:24 np0005531887 podman[217867]: 2025-11-22 07:47:24.877659004 +0000 UTC m=+0.180549112 container init f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 02:47:24 np0005531887 podman[217867]: 2025-11-22 07:47:24.884244835 +0000 UTC m=+0.187134903 container start f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:47:24 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [NOTICE]   (217884) : New worker (217886) forked
Nov 22 02:47:24 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [NOTICE]   (217884) : Loading success.
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.649 186853 DEBUG nova.compute.manager [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.651 186853 DEBUG oslo_concurrency.lockutils [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.651 186853 DEBUG oslo_concurrency.lockutils [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.651 186853 DEBUG oslo_concurrency.lockutils [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.652 186853 DEBUG nova.compute.manager [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:25 np0005531887 nova_compute[186849]: 2025-11-22 07:47:25.652 186853 WARNING nova.compute.manager [req-cd837bfa-f95b-4577-822d-c7e7d549c5ff req-7fa61026-7dc7-408f-9243-955cff39a88c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:47:27 np0005531887 nova_compute[186849]: 2025-11-22 07:47:27.156 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:30 np0005531887 nova_compute[186849]: 2025-11-22 07:47:30.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:30 np0005531887 podman[217925]: 2025-11-22 07:47:30.829162631 +0000 UTC m=+0.049955491 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:47:32 np0005531887 nova_compute[186849]: 2025-11-22 07:47:32.160 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:33 np0005531887 nova_compute[186849]: 2025-11-22 07:47:33.963 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Check if temp file /var/lib/nova/instances/tmp1r930f3w exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 22 02:47:33 np0005531887 nova_compute[186849]: 2025-11-22 07:47:33.968 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:34 np0005531887 nova_compute[186849]: 2025-11-22 07:47:34.039 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:34 np0005531887 nova_compute[186849]: 2025-11-22 07:47:34.040 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:34 np0005531887 nova_compute[186849]: 2025-11-22 07:47:34.101 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:34 np0005531887 nova_compute[186849]: 2025-11-22 07:47:34.103 186853 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 22 02:47:34 np0005531887 podman[217955]: 2025-11-22 07:47:34.832730506 +0000 UTC m=+0.055856266 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:47:35 np0005531887 nova_compute[186849]: 2025-11-22 07:47:35.277 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.403 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.481 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.482 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.550 186853 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.920 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.920 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.921 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.921 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.922 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.930 186853 INFO nova.compute.manager [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Terminating instance
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.936 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "refresh_cache-7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.936 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquired lock "refresh_cache-7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:47:36 np0005531887 nova_compute[186849]: 2025-11-22 07:47:36.937 186853 DEBUG nova.network.neutron [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.162 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.224 186853 DEBUG nova.network.neutron [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:47:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:37Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:90:34:2c 10.100.0.9
Nov 22 02:47:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:37Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:90:34:2c 10.100.0.9
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.308 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.309 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.309 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.309 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.309 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.316 186853 INFO nova.compute.manager [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Terminating instance
Nov 22 02:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:37.319 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.321 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "refresh_cache-89be3b77-79e2-4c6a-9107-a17f3f4a3fca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.321 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquired lock "refresh_cache-89be3b77-79e2-4c6a-9107-a17f3f4a3fca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.322 186853 DEBUG nova.network.neutron [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:37.323 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.599 186853 DEBUG nova.network.neutron [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.644 186853 DEBUG nova.network.neutron [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.668 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Releasing lock "refresh_cache-7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.669 186853 DEBUG nova.compute.manager [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 02:47:37 np0005531887 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000021.scope: Deactivated successfully.
Nov 22 02:47:37 np0005531887 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000021.scope: Consumed 13.850s CPU time.
Nov 22 02:47:37 np0005531887 systemd-machined[153180]: Machine qemu-13-instance-00000021 terminated.
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.924 186853 INFO nova.virt.libvirt.driver [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance destroyed successfully.
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.927 186853 DEBUG nova.objects.instance [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'resources' on Instance uuid 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.943 186853 INFO nova.virt.libvirt.driver [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Deleting instance files /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5_del
Nov 22 02:47:37 np0005531887 nova_compute[186849]: 2025-11-22 07:47:37.945 186853 INFO nova.virt.libvirt.driver [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Deletion of /var/lib/nova/instances/7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5_del complete
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.011 186853 INFO nova.compute.manager [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.012 186853 DEBUG oslo.service.loopingcall [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.013 186853 DEBUG nova.compute.manager [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.013 186853 DEBUG nova.network.neutron [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.443 186853 DEBUG nova.network.neutron [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.460 186853 DEBUG nova.network.neutron [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.477 186853 INFO nova.compute.manager [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Took 0.46 seconds to deallocate network for instance.
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.551 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.552 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.630 186853 DEBUG nova.network.neutron [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.654 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Releasing lock "refresh_cache-89be3b77-79e2-4c6a-9107-a17f3f4a3fca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.655 186853 DEBUG nova.compute.manager [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 02:47:38 np0005531887 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Deactivated successfully.
Nov 22 02:47:38 np0005531887 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Consumed 13.867s CPU time.
Nov 22 02:47:38 np0005531887 systemd-machined[153180]: Machine qemu-14-instance-00000022 terminated.
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.717 186853 DEBUG nova.compute.provider_tree [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.740 186853 DEBUG nova.scheduler.client.report [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.774 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:38 np0005531887 podman[218006]: 2025-11-22 07:47:38.776874239 +0000 UTC m=+0.060905028 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.824 186853 INFO nova.scheduler.client.report [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Deleted allocations for instance 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.895 186853 INFO nova.virt.libvirt.driver [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance destroyed successfully.
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.896 186853 DEBUG nova.objects.instance [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'resources' on Instance uuid 89be3b77-79e2-4c6a-9107-a17f3f4a3fca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.915 186853 INFO nova.virt.libvirt.driver [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Deleting instance files /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca_del
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.916 186853 INFO nova.virt.libvirt.driver [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Deletion of /var/lib/nova/instances/89be3b77-79e2-4c6a-9107-a17f3f4a3fca_del complete
Nov 22 02:47:38 np0005531887 nova_compute[186849]: 2025-11-22 07:47:38.937 186853 DEBUG oslo_concurrency.lockutils [None req-adfe0adb-aa7d-4807-acef-3779ffc27a48 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.015 186853 INFO nova.compute.manager [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.016 186853 DEBUG oslo.service.loopingcall [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.017 186853 DEBUG nova.compute.manager [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.017 186853 DEBUG nova.network.neutron [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.286 186853 DEBUG nova.network.neutron [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.300 186853 DEBUG nova.network.neutron [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.326 186853 INFO nova.compute.manager [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Took 0.31 seconds to deallocate network for instance.
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.404 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.405 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.480 186853 DEBUG nova.compute.provider_tree [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.505 186853 DEBUG nova.scheduler.client.report [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.524 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.575 186853 INFO nova.scheduler.client.report [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Deleted allocations for instance 89be3b77-79e2-4c6a-9107-a17f3f4a3fca
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.676 186853 DEBUG oslo_concurrency.lockutils [None req-d228eae0-960e-4fae-8421-d3a4a6fcc55d f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "89be3b77-79e2-4c6a-9107-a17f3f4a3fca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.785 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.786 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 02:47:39 np0005531887 nova_compute[186849]: 2025-11-22 07:47:39.786 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:47:39 np0005531887 systemd-logind[821]: New session 36 of user nova.
Nov 22 02:47:39 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:47:39 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:47:39 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:47:39 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:47:40 np0005531887 systemd[218039]: Queued start job for default target Main User Target.
Nov 22 02:47:40 np0005531887 systemd[218039]: Created slice User Application Slice.
Nov 22 02:47:40 np0005531887 systemd[218039]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:47:40 np0005531887 systemd[218039]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:47:40 np0005531887 systemd[218039]: Reached target Paths.
Nov 22 02:47:40 np0005531887 systemd[218039]: Reached target Timers.
Nov 22 02:47:40 np0005531887 systemd[218039]: Starting D-Bus User Message Bus Socket...
Nov 22 02:47:40 np0005531887 systemd[218039]: Starting Create User's Volatile Files and Directories...
Nov 22 02:47:40 np0005531887 systemd[218039]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:47:40 np0005531887 systemd[218039]: Finished Create User's Volatile Files and Directories.
Nov 22 02:47:40 np0005531887 systemd[218039]: Reached target Sockets.
Nov 22 02:47:40 np0005531887 systemd[218039]: Reached target Basic System.
Nov 22 02:47:40 np0005531887 systemd[218039]: Reached target Main User Target.
Nov 22 02:47:40 np0005531887 systemd[218039]: Startup finished in 136ms.
Nov 22 02:47:40 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:47:40 np0005531887 systemd[1]: Started Session 36 of User nova.
Nov 22 02:47:40 np0005531887 systemd-logind[821]: Session 36 logged out. Waiting for processes to exit.
Nov 22 02:47:40 np0005531887 systemd[1]: session-36.scope: Deactivated successfully.
Nov 22 02:47:40 np0005531887 systemd-logind[821]: Removed session 36.
Nov 22 02:47:40 np0005531887 nova_compute[186849]: 2025-11-22 07:47:40.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.855 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.893 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.893 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.894 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.894 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.894 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.894 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.916 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.916 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.917 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.917 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:47:41 np0005531887 nova_compute[186849]: 2025-11-22 07:47:41.981 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.053 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.054 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.142 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.150 186853 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.151 186853 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.151 186853 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.151 186853 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.151 186853 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.152 186853 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.165 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.354 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.355 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5558MB free_disk=73.42893981933594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.356 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.356 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.402 186853 INFO nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating resource usage from migration 7be40063-4c96-4b9d-85c4-ef57ecaf1c16#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.433 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Migration 7be40063-4c96-4b9d-85c4-ef57ecaf1c16 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.434 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.434 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.478 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.494 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.525 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.525 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.907 186853 INFO nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Took 6.36 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.908 186853 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.937 186853 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7be40063-4c96-4b9d-85c4-ef57ecaf1c16),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.959 186853 DEBUG nova.objects.instance [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.962 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.964 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.964 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.979 186853 DEBUG nova.virt.libvirt.vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:47:23Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.980 186853 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.980 186853 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.981 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 02:47:42 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:90:34:2c"/>
Nov 22 02:47:42 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 02:47:42 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:47:42 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 02:47:42 np0005531887 nova_compute[186849]:  <target dev="tap28cefb09-6f"/>
Nov 22 02:47:42 np0005531887 nova_compute[186849]: </interface>
Nov 22 02:47:42 np0005531887 nova_compute[186849]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 22 02:47:42 np0005531887 nova_compute[186849]: 2025-11-22 07:47:42.981 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.400 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.422 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.423 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.466 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.467 186853 INFO nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.580 186853 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 22 02:47:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:43.659 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:43.660 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:47:43 np0005531887 nova_compute[186849]: 2025-11-22 07:47:43.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:43 np0005531887 podman[218063]: 2025-11-22 07:47:43.835141353 +0000 UTC m=+0.055618520 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.083 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.083 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.313 186853 DEBUG nova.compute.manager [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.314 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.314 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.314 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.315 186853 DEBUG nova.compute.manager [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.315 186853 WARNING nova.compute.manager [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.315 186853 DEBUG nova.compute.manager [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-changed-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.315 186853 DEBUG nova.compute.manager [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Refreshing instance network info cache due to event network-changed-28cefb09-6f44-4c5f-b924-c2e3ca0082e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.316 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.316 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.316 186853 DEBUG nova.network.neutron [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Refreshing network info cache for port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.588 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.589 186853 DEBUG nova.virt.libvirt.migration [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.927 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797664.92701, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.928 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.956 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.960 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:47:44 np0005531887 nova_compute[186849]: 2025-11-22 07:47:44.992 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 22 02:47:45 np0005531887 kernel: tap28cefb09-6f (unregistering): left promiscuous mode
Nov 22 02:47:45 np0005531887 NetworkManager[55210]: <info>  [1763797665.0650] device (tap28cefb09-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00082|binding|INFO|Releasing lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 from this chassis (sb_readonly=0)
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.075 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00083|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 down in Southbound
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00084|binding|INFO|Releasing lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d from this chassis (sb_readonly=0)
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00085|binding|INFO|Setting lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d down in Southbound
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00086|binding|INFO|Removing iface tap28cefb09-6f ovn-installed in OVS
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.079 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00087|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 02:47:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:47:45Z|00088|binding|INFO|Releasing lport adbcca2d-be43-4042-953b-c108dbe75276 from this chassis (sb_readonly=0)
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.088 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:34:2c 10.100.0.9'], port_security=['fa:16:3e:90:34:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'df09844c-c111-44b4-9c36-d4950a55a590'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-109341048', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-109341048', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=28cefb09-6f44-4c5f-b924-c2e3ca0082e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.090 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b0:13 19.80.0.49'], port_security=['fa:16:3e:d2:b0:13 19.80.0.49'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['28cefb09-6f44-4c5f-b924-c2e3ca0082e1'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1622735635', 'neutron:cidrs': '19.80.0.49/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c410ae8d-536e-4819-b766-652bc78ac3e4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1622735635', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=83c2898f-e4b8-43d0-8099-6e9553385d03, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb321ea2-ecc9-494b-a270-c3aac4f36e7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.091 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 unbound from our chassis#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.092 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.094 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[613037dd-20c1-4306-8b84-be9ffa25afa5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.095 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace which is not needed anymore#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.105 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.169 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Deactivated successfully.
Nov 22 02:47:45 np0005531887 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000023.scope: Consumed 15.462s CPU time.
Nov 22 02:47:45 np0005531887 systemd-machined[153180]: Machine qemu-15-instance-00000023 terminated.
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [NOTICE]   (217769) : haproxy version is 2.8.14-c23fe91
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [NOTICE]   (217769) : path to executable is /usr/sbin/haproxy
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [WARNING]  (217769) : Exiting Master process...
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [ALERT]    (217769) : Current worker (217771) exited with code 143 (Terminated)
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217765]: [WARNING]  (217769) : All workers exited. Exiting... (0)
Nov 22 02:47:45 np0005531887 systemd[1]: libpod-f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84.scope: Deactivated successfully.
Nov 22 02:47:45 np0005531887 podman[218115]: 2025-11-22 07:47:45.226551498 +0000 UTC m=+0.044997580 container died f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:47:45 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84-userdata-shm.mount: Deactivated successfully.
Nov 22 02:47:45 np0005531887 systemd[1]: var-lib-containers-storage-overlay-e1fbd7087e57143473a70d6697fc047cfb1fb8a261f57ba3314945890fb01409-merged.mount: Deactivated successfully.
Nov 22 02:47:45 np0005531887 podman[218115]: 2025-11-22 07:47:45.269660381 +0000 UTC m=+0.088106443 container cleanup f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:47:45 np0005531887 systemd[1]: libpod-conmon-f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84.scope: Deactivated successfully.
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.282 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.318 186853 DEBUG nova.virt.libvirt.guest [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.318 186853 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migration operation has completed#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.318 186853 INFO nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] _post_live_migration() is started..#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.321 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.322 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.322 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 22 02:47:45 np0005531887 podman[218152]: 2025-11-22 07:47:45.345360761 +0000 UTC m=+0.049500971 container remove f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.350 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c70015da-14ce-4ae8-9f9a-5f378dbbb37f]: (4, ('Sat Nov 22 07:47:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84)\nf28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84\nSat Nov 22 07:47:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (f28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84)\nf28d5c5ee3e903bb423d91f7dd941994d97c5bfb8870ab0b0caa6b34d799ff84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.352 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[76a1153d-42ba-418f-8828-bafd8bba0f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.353 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:45 np0005531887 kernel: tapc3f966e1-80: left promiscuous mode
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.373 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ce13185b-a705-4f5b-b6f0-d2e826d0ce68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.387 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ab057a-1edf-4c9d-953d-235068a82982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.389 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8c50009c-be88-4f4d-be5a-8202fc23ccca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.404 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd98eba-df89-43e8-b696-926dca7abbda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438568, 'reachable_time': 19098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218180, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 systemd[1]: run-netns-ovnmeta\x2dc3f966e1\x2d8cff\x2d4ca0\x2d9b4f\x2da318c31b0a70.mount: Deactivated successfully.
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.411 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.411 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0867d998-9262-45eb-9718-00c1a605292e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.412 104084 INFO neutron.agent.ovn.metadata.agent [-] Port eb321ea2-ecc9-494b-a270-c3aac4f36e7d in datapath c410ae8d-536e-4819-b766-652bc78ac3e4 unbound from our chassis#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.413 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c410ae8d-536e-4819-b766-652bc78ac3e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.414 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6736d8c8-a524-4cf9-b39e-ef2ed929dac4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.414 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 namespace which is not needed anymore#033[00m
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [NOTICE]   (217884) : haproxy version is 2.8.14-c23fe91
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [NOTICE]   (217884) : path to executable is /usr/sbin/haproxy
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [WARNING]  (217884) : Exiting Master process...
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [ALERT]    (217884) : Current worker (217886) exited with code 143 (Terminated)
Nov 22 02:47:45 np0005531887 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[217880]: [WARNING]  (217884) : All workers exited. Exiting... (0)
Nov 22 02:47:45 np0005531887 systemd[1]: libpod-f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8.scope: Deactivated successfully.
Nov 22 02:47:45 np0005531887 podman[218198]: 2025-11-22 07:47:45.542776004 +0000 UTC m=+0.045716408 container died f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:47:45 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8-userdata-shm.mount: Deactivated successfully.
Nov 22 02:47:45 np0005531887 systemd[1]: var-lib-containers-storage-overlay-5e30640c6b9e775f6252005b57cb2c172d85c4ea8fd38e93384ae4bdc3573cd5-merged.mount: Deactivated successfully.
Nov 22 02:47:45 np0005531887 podman[218198]: 2025-11-22 07:47:45.575718448 +0000 UTC m=+0.078658842 container cleanup f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 02:47:45 np0005531887 systemd[1]: libpod-conmon-f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8.scope: Deactivated successfully.
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.620 186853 DEBUG nova.compute.manager [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.620 186853 DEBUG oslo_concurrency.lockutils [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.621 186853 DEBUG oslo_concurrency.lockutils [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.621 186853 DEBUG oslo_concurrency.lockutils [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.621 186853 DEBUG nova.compute.manager [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.621 186853 DEBUG nova.compute.manager [req-234712d3-5c1f-4c05-a32c-0bd72d74dc55 req-e0d1ee0b-58ba-4364-ba6e-06a597828046 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:47:45 np0005531887 podman[218229]: 2025-11-22 07:47:45.635245773 +0000 UTC m=+0.038864760 container remove f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.641 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f2530871-e41e-4089-b0ca-7cd1a1e86c53]: (4, ('Sat Nov 22 07:47:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 (f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8)\nf5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8\nSat Nov 22 07:47:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 (f5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8)\nf5d4e409d12c1457978220d254d507b3b3c1b8224c2924eb52ef69941096dca8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.642 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8514d319-a981-4205-86f3-9fea78ef7f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.643 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc410ae8d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:45 np0005531887 kernel: tapc410ae8d-50: left promiscuous mode
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.645 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 nova_compute[186849]: 2025-11-22 07:47:45.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.661 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9f22879d-26ad-482c-afd0-5609e2292fa2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.680 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c26457f0-7efa-48da-b417-b3a523a32adc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.682 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a51545-81e4-43e0-9dac-d90b59459fb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.697 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8331bbac-4d08-4083-b011-bc2b8299f9a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438661, 'reachable_time': 31883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218248, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.698 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:47:45 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:45.699 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[82adcb71-aecc-4089-a4e6-e5c9b47e2a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:47:46 np0005531887 systemd[1]: run-netns-ovnmeta\x2dc410ae8d\x2d536e\x2d4819\x2db766\x2d652bc78ac3e4.mount: Deactivated successfully.
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.597 186853 DEBUG nova.network.neutron [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updated VIF entry in instance network info cache for port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.598 186853 DEBUG nova.network.neutron [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.616 186853 DEBUG oslo_concurrency.lockutils [req-857c1dbc-0028-48b0-8f13-834d1c4a7a15 req-362f2fe0-b698-4a44-b373-065d0129b3cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.665 186853 DEBUG nova.compute.manager [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.665 186853 DEBUG oslo_concurrency.lockutils [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.665 186853 DEBUG oslo_concurrency.lockutils [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.665 186853 DEBUG oslo_concurrency.lockutils [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.666 186853 DEBUG nova.compute.manager [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:46 np0005531887 nova_compute[186849]: 2025-11-22 07:47:46.666 186853 DEBUG nova.compute.manager [req-c902b459-f0a1-4108-9927-64cabbfb5917 req-8f779ce8-d27d-4bf1-961c-355f4b82e9d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.031 186853 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Activated binding for port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.031 186853 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.032 186853 DEBUG nova.virt.libvirt.vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:47:33Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.032 186853 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.033 186853 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.033 186853 DEBUG os_vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.034 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.035 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28cefb09-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.036 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.042 186853 INFO os_vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f')#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.042 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.042 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.043 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.043 186853 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.043 186853 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Deleting instance files /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4_del#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.044 186853 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Deletion of /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4_del complete#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.704 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.704 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.704 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.704 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.705 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.705 186853 WARNING nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.705 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.705 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.706 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.706 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.706 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.706 186853 WARNING nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.706 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.707 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.707 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.707 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.707 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.708 186853 WARNING nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.708 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.708 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.708 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.709 186853 DEBUG oslo_concurrency.lockutils [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.709 186853 DEBUG nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:47:47 np0005531887 nova_compute[186849]: 2025-11-22 07:47:47.709 186853 WARNING nova.compute.manager [req-ac5900be-f801-4db3-9ff8-81a7f2c1e1de req-0bc75ccb-6b2b-44bd-aab8-dced1c643a93 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 02:47:48 np0005531887 podman[218249]: 2025-11-22 07:47:48.838668339 +0000 UTC m=+0.058257304 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, architecture=x86_64, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:47:50 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:47:50 np0005531887 systemd[218039]: Activating special unit Exit the Session...
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped target Main User Target.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped target Basic System.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped target Paths.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped target Sockets.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped target Timers.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:47:50 np0005531887 systemd[218039]: Closed D-Bus User Message Bus Socket.
Nov 22 02:47:50 np0005531887 systemd[218039]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:47:50 np0005531887 systemd[218039]: Removed slice User Application Slice.
Nov 22 02:47:50 np0005531887 systemd[218039]: Reached target Shutdown.
Nov 22 02:47:50 np0005531887 systemd[218039]: Finished Exit the Session.
Nov 22 02:47:50 np0005531887 systemd[218039]: Reached target Exit the Session.
Nov 22 02:47:50 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:47:50 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:47:50 np0005531887 nova_compute[186849]: 2025-11-22 07:47:50.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:50 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:47:50 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:47:50 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:47:50 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:47:50 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.037 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:47:52.661 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.922 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797657.9199512, 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.923 186853 INFO nova.compute.manager [-] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.950 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.951 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.951 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.960 186853 DEBUG nova.compute.manager [None req-712c94ac-10de-4470-b5ba-b6a60e440b6a - - - - - -] [instance: 7fc0f20e-33b6-4dea-b6f5-f8bb10248ab5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.974 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.974 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.974 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:52 np0005531887 nova_compute[186849]: 2025-11-22 07:47:52.975 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.170 186853 WARNING nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.171 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.45832061767578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.171 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.171 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.221 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration for instance b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.260 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.300 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration 7be40063-4c96-4b9d-85c4-ef57ecaf1c16 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.301 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.301 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.416 186853 DEBUG nova.compute.provider_tree [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.431 186853 DEBUG nova.scheduler.client.report [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.459 186853 DEBUG nova.compute.resource_tracker [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.459 186853 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.469 186853 INFO nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.596 186853 INFO nova.scheduler.client.report [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Deleted allocation for migration 7be40063-4c96-4b9d-85c4-ef57ecaf1c16#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.596 186853 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.894 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797658.893199, 89be3b77-79e2-4c6a-9107-a17f3f4a3fca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.894 186853 INFO nova.compute.manager [-] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:47:53 np0005531887 nova_compute[186849]: 2025-11-22 07:47:53.926 186853 DEBUG nova.compute.manager [None req-18367633-f7a2-474a-85d3-95a9bdbdf677 - - - - - -] [instance: 89be3b77-79e2-4c6a-9107-a17f3f4a3fca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:47:54 np0005531887 podman[218277]: 2025-11-22 07:47:54.746923099 +0000 UTC m=+0.082729382 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 02:47:54 np0005531887 podman[218276]: 2025-11-22 07:47:54.752777453 +0000 UTC m=+0.084297572 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:47:55 np0005531887 nova_compute[186849]: 2025-11-22 07:47:55.313 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.041 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.481 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "0d5142cf-dac9-4d44-a43f-edcef823f370" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.482 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.563 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.762 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.763 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.772 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.773 186853 INFO nova.compute.claims [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:47:57 np0005531887 nova_compute[186849]: 2025-11-22 07:47:57.996 186853 DEBUG nova.compute.provider_tree [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.017 186853 DEBUG nova.scheduler.client.report [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.125 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.126 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.203 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.204 186853 DEBUG nova.network.neutron [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.231 186853 INFO nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.259 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.399 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.400 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.401 186853 INFO nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Creating image(s)#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.401 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.401 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.402 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.419 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.485 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.486 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.487 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.499 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.554 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.555 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.589 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.590 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.590 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.649 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.651 186853 DEBUG nova.virt.disk.api [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Checking if we can resize image /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.651 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.710 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.712 186853 DEBUG nova.virt.disk.api [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Cannot resize image /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.712 186853 DEBUG nova.objects.instance [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d5142cf-dac9-4d44-a43f-edcef823f370 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.755 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.756 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Ensure instance console log exists: /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.756 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.757 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.757 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.986 186853 DEBUG nova.network.neutron [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.986 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.988 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.992 186853 WARNING nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.998 186853 DEBUG nova.virt.libvirt.host [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:47:58 np0005531887 nova_compute[186849]: 2025-11-22 07:47:58.999 186853 DEBUG nova.virt.libvirt.host [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.002 186853 DEBUG nova.virt.libvirt.host [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.003 186853 DEBUG nova.virt.libvirt.host [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.004 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.004 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.005 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.005 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.005 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.006 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.006 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.006 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.006 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.007 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.007 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.007 186853 DEBUG nova.virt.hardware [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.011 186853 DEBUG nova.objects.instance [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d5142cf-dac9-4d44-a43f-edcef823f370 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.031 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <uuid>0d5142cf-dac9-4d44-a43f-edcef823f370</uuid>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <name>instance-00000028</name>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1037960765</nova:name>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:47:58</nova:creationTime>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:user uuid="f9a51b2699f1471d9e9b3463921a67fe">tempest-ListImageFiltersTestJSON-497193209-project-member</nova:user>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:        <nova:project uuid="072c26a765bb4c6081d04d313aceda15">tempest-ListImageFiltersTestJSON-497193209</nova:project>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="serial">0d5142cf-dac9-4d44-a43f-edcef823f370</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="uuid">0d5142cf-dac9-4d44-a43f-edcef823f370</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.config"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/console.log" append="off"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:47:59 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:47:59 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:47:59 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:47:59 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.088 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.088 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.089 186853 INFO nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Using config drive#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.504 186853 INFO nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Creating config drive at /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.config#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.537 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9fd4uhlu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:47:59 np0005531887 nova_compute[186849]: 2025-11-22 07:47:59.661 186853 DEBUG oslo_concurrency.processutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9fd4uhlu" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:47:59 np0005531887 systemd-machined[153180]: New machine qemu-16-instance-00000028.
Nov 22 02:47:59 np0005531887 systemd[1]: Started Virtual Machine qemu-16-instance-00000028.
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.314 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.318 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797665.3175924, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.319 186853 INFO nova.compute.manager [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.406 186853 DEBUG nova.compute.manager [None req-98a79a2f-2b31-45e9-9ebd-43e5d960db4c - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.517 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797680.516945, 0d5142cf-dac9-4d44-a43f-edcef823f370 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.518 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.520 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.520 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.523 186853 INFO nova.virt.libvirt.driver [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance spawned successfully.#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.523 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.549 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.554 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.558 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.558 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.558 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.559 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.559 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.559 186853 DEBUG nova.virt.libvirt.driver [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.595 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.595 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797680.5171072, 0d5142cf-dac9-4d44-a43f-edcef823f370 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.595 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] VM Started (Lifecycle Event)#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.626 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.629 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.654 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.842 186853 INFO nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Took 2.44 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:48:00 np0005531887 nova_compute[186849]: 2025-11-22 07:48:00.843 186853 DEBUG nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:01 np0005531887 nova_compute[186849]: 2025-11-22 07:48:01.015 186853 INFO nova.compute.manager [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Took 3.29 seconds to build instance.#033[00m
Nov 22 02:48:01 np0005531887 nova_compute[186849]: 2025-11-22 07:48:01.086 186853 DEBUG oslo_concurrency.lockutils [None req-54dd87b8-7f3b-425f-824a-3b8d21efc7c5 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:01 np0005531887 podman[218362]: 2025-11-22 07:48:01.835288693 +0000 UTC m=+0.055613920 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:48:02 np0005531887 nova_compute[186849]: 2025-11-22 07:48:02.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:05 np0005531887 nova_compute[186849]: 2025-11-22 07:48:05.316 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:05 np0005531887 podman[218386]: 2025-11-22 07:48:05.825549132 +0000 UTC m=+0.050822562 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 02:48:07 np0005531887 nova_compute[186849]: 2025-11-22 07:48:07.046 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:09 np0005531887 podman[218405]: 2025-11-22 07:48:09.839668925 +0000 UTC m=+0.059469184 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:48:10 np0005531887 nova_compute[186849]: 2025-11-22 07:48:10.318 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:12 np0005531887 nova_compute[186849]: 2025-11-22 07:48:12.049 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:12 np0005531887 nova_compute[186849]: 2025-11-22 07:48:12.490 186853 DEBUG nova.compute.manager [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:12 np0005531887 nova_compute[186849]: 2025-11-22 07:48:12.556 186853 INFO nova.compute.manager [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] instance snapshotting#033[00m
Nov 22 02:48:12 np0005531887 nova_compute[186849]: 2025-11-22 07:48:12.782 186853 INFO nova.virt.libvirt.driver [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Beginning live snapshot process#033[00m
Nov 22 02:48:13 np0005531887 virtqemud[186424]: invalid argument: disk vda does not have an active block job
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.020 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.085 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.086 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.161 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370/disk --force-share --output=json -f qcow2" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.174 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.241 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.242 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.276 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533.delta 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.278 186853 INFO nova.virt.libvirt.driver [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.327 186853 DEBUG nova.virt.libvirt.guest [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 0 final cursor: 27918336 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.830 186853 DEBUG nova.virt.libvirt.guest [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] COPY block job progress, current cursor: 27918336 final cursor: 27918336 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.834 186853 INFO nova.virt.libvirt.driver [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.874 186853 DEBUG nova.privsep.utils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:48:13 np0005531887 nova_compute[186849]: 2025-11-22 07:48:13.874 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533.delta /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:48:13 np0005531887 podman[218462]: 2025-11-22 07:48:13.937080563 +0000 UTC m=+0.052568555 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:48:14 np0005531887 nova_compute[186849]: 2025-11-22 07:48:14.451 186853 DEBUG oslo_concurrency.processutils [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533.delta /var/lib/nova/instances/snapshots/tmp0o8fis_j/0b149bf5dcf04a10ac4f1b658767a533" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:48:14 np0005531887 nova_compute[186849]: 2025-11-22 07:48:14.457 186853 INFO nova.virt.libvirt.driver [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:48:15 np0005531887 nova_compute[186849]: 2025-11-22 07:48:15.321 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:17 np0005531887 nova_compute[186849]: 2025-11-22 07:48:17.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:17 np0005531887 nova_compute[186849]: 2025-11-22 07:48:17.399 186853 INFO nova.virt.libvirt.driver [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Snapshot image upload complete#033[00m
Nov 22 02:48:17 np0005531887 nova_compute[186849]: 2025-11-22 07:48:17.399 186853 INFO nova.compute.manager [None req-3121c0ee-b466-4942-a6ad-e35568932f61 f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Took 4.84 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:48:19 np0005531887 podman[218492]: 2025-11-22 07:48:19.840194087 +0000 UTC m=+0.060609301 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 22 02:48:20 np0005531887 nova_compute[186849]: 2025-11-22 07:48:20.322 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:22 np0005531887 nova_compute[186849]: 2025-11-22 07:48:22.054 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:25 np0005531887 nova_compute[186849]: 2025-11-22 07:48:25.323 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:25 np0005531887 podman[218515]: 2025-11-22 07:48:25.835213007 +0000 UTC m=+0.057663369 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 02:48:25 np0005531887 podman[218516]: 2025-11-22 07:48:25.851606458 +0000 UTC m=+0.071382455 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:48:27 np0005531887 nova_compute[186849]: 2025-11-22 07:48:27.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:30 np0005531887 nova_compute[186849]: 2025-11-22 07:48:30.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:32 np0005531887 nova_compute[186849]: 2025-11-22 07:48:32.078 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:32 np0005531887 podman[218562]: 2025-11-22 07:48:32.824991701 +0000 UTC m=+0.045528963 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:48:35 np0005531887 nova_compute[186849]: 2025-11-22 07:48:35.329 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:35 np0005531887 nova_compute[186849]: 2025-11-22 07:48:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:35 np0005531887 nova_compute[186849]: 2025-11-22 07:48:35.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:48:35 np0005531887 nova_compute[186849]: 2025-11-22 07:48:35.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.660 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '072c26a765bb4c6081d04d313aceda15', 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'hostId': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.669 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.670 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fda85b-8ead-4f57-837b-06f2ed9c94b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.660916', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a715211e-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': 'f40c6b1f5e8a2c98b8a14fc7e62e78fbdb0ea52cfed44fe05a20714dc4a1df69'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.660916', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7152efc-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': '29571187427c785978852534c7912c94260101766cf44f62523ee338b714c49b'}]}, 'timestamp': '2025-11-22 07:48:36.670581', '_unique_id': '3df0dee1f30f4c12bb82c48e68c4c6c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>]
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>]
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.673 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.673 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f428c72f-133d-40b9-9133-31e541b2febe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.673040', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71599f0-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': '267e56a0b33106d1fd8393ec0762c5d932aefb1d6f6e3e256651b93a267eabb8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.673040', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a715a544-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': '9fd2949159e906e5b7a93ed42dd88dd40ecc3e7f47195546636b58f2b377237a'}]}, 'timestamp': '2025-11-22 07:48:36.673614', '_unique_id': '544993dfac3f456297243ac2afe65d21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.701 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.latency volume: 804503467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.702 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.latency volume: 56345456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5ca49a5-7f46-4dec-95af-e0e082f560b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 804503467, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.675130', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71a0e68-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '4118bbfbdfabf37d555c091c448271fec1a287efa4c5fc421e47d07e997a3b87'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56345456, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.675130', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71a1d7c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '6944bd2088974cee1d69a5e15bcba85ddbbdae41129f822261b108ce39522511'}]}, 'timestamp': '2025-11-22 07:48:36.702914', '_unique_id': 'a412f8ca1c9746b08268ee81f0d75874'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.705 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.707 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.latency volume: 3822429458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.707 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43eb63bf-2db6-4a84-9326-17be63394eb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3822429458, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.707043', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71ace7a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '3dab1ff643210891c891d645e535137397c1c79c36ecab821b1f6426487cd500'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': 
None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.707043', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71adbf4-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '00200cade248bbd62678ff3f73d70bf255f36bba0f7a6003f55a901642f5c7dd'}]}, 'timestamp': '2025-11-22 07:48:36.707782', '_unique_id': '356f24b898d5444ca5ab6bd552b8522b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.709 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.710 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ae40978-4f24-4bb8-a478-6cb76f66dcca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.709734', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71b34a0-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': '5b41c133e8ec78482ca6438127133e31a979a92633640bd255c8317ee8e544eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 
'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.709734', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71b4062-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.330704428, 'message_signature': '69ae94dddbfe6e06d3ff29caed43505d30ee7dc72f2a4974c7706c231e4c6348'}]}, 'timestamp': '2025-11-22 07:48:36.710369', '_unique_id': '04fcb8b90d844b0193c8ce6c4a6a551d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.712 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.712 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>]
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.712 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.727 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/memory.usage volume: 41.078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7b60050-3a6c-4e65-a79b-075dfd6f2b26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.078125, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'timestamp': '2025-11-22T07:48:36.712732', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a71de218-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.396794313, 'message_signature': '5646703128d0be2be7b11b26818aa2772440844db1716c8ade7b7774529ecf44'}]}, 'timestamp': '2025-11-22 07:48:36.727666', '_unique_id': 'bd96fa90c06e433386d8b840d1b91a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.729 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.730 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dd2c275-6ac5-41f7-9768-25c5285973fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.729846', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71e456e-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '854b6ebf3cb877eef9ea5e7a73828e0cd221c0af016f82f6bb2dfee14e4432a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.729846', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71e50e0-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '448a82a7bbec18c257095f167e1422392fcb5a50465bd469aaee2d17e0e82aaa'}]}, 'timestamp': '2025-11-22 07:48:36.730409', '_unique_id': '6e094c2382b84010b5f6bb63a19b21e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListImageFiltersTestJSON-server-1037960765>]
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.bytes volume: 30820864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.732 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd55ddd40-facb-4f09-9711-56033ae832a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30820864, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.732624', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71eb15c-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '8ec952d15270e749f4c2dda804b7908b7928c12080ed4661257d1083f61ef8e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.732624', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71ebb2a-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': 'b8e8872db618c1ce0e01b2d899f377565562d219f7ef225c06eba63585576eac'}]}, 'timestamp': '2025-11-22 07:48:36.733131', '_unique_id': '4113613809264f819ef8dcab22cf1fb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.734 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.requests volume: 1113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.734 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d2f38de-8f18-4740-bf7c-229c7b433f10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1113, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.734435', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71ef8e2-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '4c12b754e6919d491a8a9e6b2bf885f915f12d54e06b149e121df8e5f49539a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 
'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.734435', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71f01ca-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '9e0e92e0998297841646d96b17e26ffa993184e83b71c5fba7368a642156ebb5'}]}, 'timestamp': '2025-11-22 07:48:36.734911', '_unique_id': 'd50f7a3a3ce24e6d97fa5c4e594bcca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.736 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/cpu volume: 12850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e2d2416-f2ed-4bae-81a0-182e30729db6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12850000000, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'timestamp': '2025-11-22T07:48:36.736368', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a71f43ec-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.396794313, 'message_signature': '41663d20ba3ec226e61be8b512abf96e780acdfbff8f831918eebcb6d745157c'}]}, 'timestamp': '2025-11-22 07:48:36.736648', '_unique_id': '78e9a42615e341c680f54d1bed2ea90b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.738 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.bytes volume: 72835072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.738 12 DEBUG ceilometer.compute.pollsters [-] 0d5142cf-dac9-4d44-a43f-edcef823f370/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17c6737-41fb-4e38-a409-e8fc1b419eb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72835072, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-vda', 'timestamp': '2025-11-22T07:48:36.738000', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71f8410-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': 'd0687821905eabf517632a5d32bb8343186def95109b23df1c509e01c5769ca8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f9a51b2699f1471d9e9b3463921a67fe', 'user_name': None, 'project_id': '072c26a765bb4c6081d04d313aceda15', 'project_name': None, 
'resource_id': '0d5142cf-dac9-4d44-a43f-edcef823f370-sda', 'timestamp': '2025-11-22T07:48:36.738000', 'resource_metadata': {'display_name': 'tempest-ListImageFiltersTestJSON-server-1037960765', 'name': 'instance-00000028', 'instance_id': '0d5142cf-dac9-4d44-a43f-edcef823f370', 'instance_type': 'm1.nano', 'host': 'd6b860666b1be87d64d4600a39996f425a630442930bd0885afaa6ae', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a71f8f6e-c777-11f0-9b25-fa163ecc0304', 'monotonic_time': 4459.344942416, 'message_signature': '53cb098964b6aace7af4a67eb2be3d78284fa055c10810e34d64bb127a9d41df'}]}, 'timestamp': '2025-11-22 07:48:36.738587', '_unique_id': '4101cb9541104492b822a899630e528d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:48:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:48:36 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 02:48:36 np0005531887 podman[218587]: 2025-11-22 07:48:36.830221177 +0000 UTC m=+0.051403787 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:48:37 np0005531887 nova_compute[186849]: 2025-11-22 07:48:37.081 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:37.319 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:37.320 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:37.320 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:37 np0005531887 nova_compute[186849]: 2025-11-22 07:48:37.776 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:37 np0005531887 nova_compute[186849]: 2025-11-22 07:48:37.777 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:38 np0005531887 nova_compute[186849]: 2025-11-22 07:48:38.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.743 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "0d5142cf-dac9-4d44-a43f-edcef823f370" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.743 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.743 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "0d5142cf-dac9-4d44-a43f-edcef823f370-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.744 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.744 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.751 186853 INFO nova.compute.manager [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Terminating instance#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.756 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "refresh_cache-0d5142cf-dac9-4d44-a43f-edcef823f370" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.757 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquired lock "refresh_cache-0d5142cf-dac9-4d44-a43f-edcef823f370" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.757 186853 DEBUG nova.network.neutron [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:48:39 np0005531887 nova_compute[186849]: 2025-11-22 07:48:39.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.261 186853 DEBUG nova.network.neutron [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.330 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.577 186853 DEBUG nova.network.neutron [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.609 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Releasing lock "refresh_cache-0d5142cf-dac9-4d44-a43f-edcef823f370" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.610 186853 DEBUG nova.compute.manager [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:48:40 np0005531887 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000028.scope: Deactivated successfully.
Nov 22 02:48:40 np0005531887 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000028.scope: Consumed 15.514s CPU time.
Nov 22 02:48:40 np0005531887 systemd-machined[153180]: Machine qemu-16-instance-00000028 terminated.
Nov 22 02:48:40 np0005531887 podman[218608]: 2025-11-22 07:48:40.720241538 +0000 UTC m=+0.055766134 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.776 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.776 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.776 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.791 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.791 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.854 186853 INFO nova.virt.libvirt.driver [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance destroyed successfully.#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.854 186853 DEBUG nova.objects.instance [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lazy-loading 'resources' on Instance uuid 0d5142cf-dac9-4d44-a43f-edcef823f370 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.873 186853 INFO nova.virt.libvirt.driver [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Deleting instance files /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370_del#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.874 186853 INFO nova.virt.libvirt.driver [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Deletion of /var/lib/nova/instances/0d5142cf-dac9-4d44-a43f-edcef823f370_del complete#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.948 186853 INFO nova.compute.manager [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.949 186853 DEBUG oslo.service.loopingcall [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.949 186853 DEBUG nova.compute.manager [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:48:40 np0005531887 nova_compute[186849]: 2025-11-22 07:48:40.949 186853 DEBUG nova.network.neutron [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.587 186853 DEBUG nova.network.neutron [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.600 186853 DEBUG nova.network.neutron [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.611 186853 INFO nova.compute.manager [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Took 0.66 seconds to deallocate network for instance.#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.703 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.703 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.984 186853 DEBUG nova.compute.provider_tree [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:41 np0005531887 nova_compute[186849]: 2025-11-22 07:48:41.999 186853 DEBUG nova.scheduler.client.report [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.035 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.037 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.038 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.038 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.083 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.186 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.187 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=73.45829391479492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.188 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.188 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.224 186853 INFO nova.scheduler.client.report [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Deleted allocations for instance 0d5142cf-dac9-4d44-a43f-edcef823f370#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.259 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.259 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.283 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.296 186853 DEBUG oslo_concurrency.lockutils [None req-614f75aa-3065-4ef3-9626-b8167769abfa f9a51b2699f1471d9e9b3463921a67fe 072c26a765bb4c6081d04d313aceda15 - - default default] Lock "0d5142cf-dac9-4d44-a43f-edcef823f370" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.309 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.334 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:48:42 np0005531887 nova_compute[186849]: 2025-11-22 07:48:42.335 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.335 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.335 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.336 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:48:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:43.939 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:48:43 np0005531887 nova_compute[186849]: 2025-11-22 07:48:43.940 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:43.941 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:48:44 np0005531887 podman[218638]: 2025-11-22 07:48:44.826077481 +0000 UTC m=+0.049349168 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:48:45 np0005531887 nova_compute[186849]: 2025-11-22 07:48:45.332 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:47 np0005531887 nova_compute[186849]: 2025-11-22 07:48:47.085 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:48:48.943 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:48:50 np0005531887 nova_compute[186849]: 2025-11-22 07:48:50.334 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:50 np0005531887 podman[218661]: 2025-11-22 07:48:50.838843264 +0000 UTC m=+0.057150718 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Nov 22 02:48:52 np0005531887 nova_compute[186849]: 2025-11-22 07:48:52.087 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:48:52Z|00089|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 02:48:55 np0005531887 nova_compute[186849]: 2025-11-22 07:48:55.336 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:48:55 np0005531887 nova_compute[186849]: 2025-11-22 07:48:55.853 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797720.8520787, 0d5142cf-dac9-4d44-a43f-edcef823f370 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:48:55 np0005531887 nova_compute[186849]: 2025-11-22 07:48:55.853 186853 INFO nova.compute.manager [-] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:48:55 np0005531887 nova_compute[186849]: 2025-11-22 07:48:55.879 186853 DEBUG nova.compute.manager [None req-94b6b9bf-cce0-4e65-b43c-e89d913424cb - - - - - -] [instance: 0d5142cf-dac9-4d44-a43f-edcef823f370] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:48:56 np0005531887 podman[218682]: 2025-11-22 07:48:56.857839664 +0000 UTC m=+0.079307193 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 02:48:56 np0005531887 podman[218683]: 2025-11-22 07:48:56.883185064 +0000 UTC m=+0.100210312 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 02:48:57 np0005531887 nova_compute[186849]: 2025-11-22 07:48:57.089 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:00 np0005531887 nova_compute[186849]: 2025-11-22 07:49:00.338 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:02 np0005531887 nova_compute[186849]: 2025-11-22 07:49:02.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:03 np0005531887 podman[218727]: 2025-11-22 07:49:03.82467562 +0000 UTC m=+0.048842985 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:49:05 np0005531887 nova_compute[186849]: 2025-11-22 07:49:05.341 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:07 np0005531887 nova_compute[186849]: 2025-11-22 07:49:07.094 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:07 np0005531887 podman[218751]: 2025-11-22 07:49:07.819033083 +0000 UTC m=+0.043428360 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:49:10 np0005531887 nova_compute[186849]: 2025-11-22 07:49:10.342 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:10 np0005531887 podman[218770]: 2025-11-22 07:49:10.850541895 +0000 UTC m=+0.070298369 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:49:12 np0005531887 nova_compute[186849]: 2025-11-22 07:49:12.096 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:15 np0005531887 nova_compute[186849]: 2025-11-22 07:49:15.344 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:15 np0005531887 podman[218789]: 2025-11-22 07:49:15.837114761 +0000 UTC m=+0.052552757 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:49:17 np0005531887 nova_compute[186849]: 2025-11-22 07:49:17.099 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:20 np0005531887 nova_compute[186849]: 2025-11-22 07:49:20.345 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:21 np0005531887 podman[218814]: 2025-11-22 07:49:21.843978267 +0000 UTC m=+0.057849028 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal)
Nov 22 02:49:22 np0005531887 nova_compute[186849]: 2025-11-22 07:49:22.102 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.080 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.081 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.127 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.220 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.221 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.229 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.229 186853 INFO nova.compute.claims [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.350 186853 DEBUG nova.compute.provider_tree [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.365 186853 DEBUG nova.scheduler.client.report [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.399 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.401 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.472 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.472 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.495 186853 INFO nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.512 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.665 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.667 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.667 186853 INFO nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Creating image(s)#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.669 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.669 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.670 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.686 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.708 186853 DEBUG nova.policy [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '734c4f1eee2d4b3b903662ad118275cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '016630e9d4644c9a97e64dd376a8ea67', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.752 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.753 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.754 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.765 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.828 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.829 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.884 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.886 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.886 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.950 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.951 186853 DEBUG nova.virt.disk.api [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Checking if we can resize image /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:49:24 np0005531887 nova_compute[186849]: 2025-11-22 07:49:24.952 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.005 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.007 186853 DEBUG nova.virt.disk.api [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Cannot resize image /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.007 186853 DEBUG nova.objects.instance [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lazy-loading 'migration_context' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.019 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.020 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Ensure instance console log exists: /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.021 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.021 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.021 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:25 np0005531887 nova_compute[186849]: 2025-11-22 07:49:25.348 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:26 np0005531887 nova_compute[186849]: 2025-11-22 07:49:26.283 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Successfully created port: 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.104 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.627 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Successfully updated port: 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.639 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.640 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.640 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.705 186853 DEBUG nova.compute.manager [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.705 186853 DEBUG nova.compute.manager [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing instance network info cache due to event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.705 186853 DEBUG oslo_concurrency.lockutils [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:27 np0005531887 nova_compute[186849]: 2025-11-22 07:49:27.781 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:49:27 np0005531887 podman[218854]: 2025-11-22 07:49:27.836081476 +0000 UTC m=+0.057407148 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm)
Nov 22 02:49:27 np0005531887 podman[218855]: 2025-11-22 07:49:27.863549499 +0000 UTC m=+0.082080482 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.390 186853 DEBUG nova.network.neutron [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.417 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.417 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Instance network_info: |[{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.418 186853 DEBUG oslo_concurrency.lockutils [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.418 186853 DEBUG nova.network.neutron [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.420 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Start _get_guest_xml network_info=[{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.425 186853 WARNING nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.431 186853 DEBUG nova.virt.libvirt.host [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.431 186853 DEBUG nova.virt.libvirt.host [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.435 186853 DEBUG nova.virt.libvirt.host [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.435 186853 DEBUG nova.virt.libvirt.host [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.436 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.437 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.437 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.437 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.437 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.438 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.438 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.438 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.438 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.438 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.439 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.439 186853 DEBUG nova.virt.hardware [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.442 186853 DEBUG nova.virt.libvirt.vif [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:49:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-915692846',display_name='tempest-AttachInterfacesUnderV243Test-server-915692846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-915692846',id=45,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI9I7AKHwzsdyW94e1BqU/1gxwz29bZH4aT61+oXBp3nixnDiOKpR15Cg7stiR0RFGonescIbjAMshz0mpaqhcsA6H0iKeUrj8npA+NNbPz2j6/WZYDlc6g1Ggpwz39eCw==',key_name='tempest-keypair-62070966',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='016630e9d4644c9a97e64dd376a8ea67',ramdisk_id='',reservation_id='r-wbtde39v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-58749522',owner_user_name='tempest-AttachInterfacesUnderV243Test-58749522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:49:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='734c4f1eee2d4b3b903662ad118275cb',uuid=e59c400d-6c4e-44c1-b797-2809b3ce436f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.442 186853 DEBUG nova.network.os_vif_util [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converting VIF {"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.444 186853 DEBUG nova.network.os_vif_util [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.445 186853 DEBUG nova.objects.instance [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lazy-loading 'pci_devices' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.458 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <uuid>e59c400d-6c4e-44c1-b797-2809b3ce436f</uuid>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <name>instance-0000002d</name>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:name>tempest-AttachInterfacesUnderV243Test-server-915692846</nova:name>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:49:29</nova:creationTime>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:user uuid="734c4f1eee2d4b3b903662ad118275cb">tempest-AttachInterfacesUnderV243Test-58749522-project-member</nova:user>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:project uuid="016630e9d4644c9a97e64dd376a8ea67">tempest-AttachInterfacesUnderV243Test-58749522</nova:project>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        <nova:port uuid="949f6b09-9b43-4db5-bd85-be3c96c1c5c0">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="serial">e59c400d-6c4e-44c1-b797-2809b3ce436f</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="uuid">e59c400d-6c4e-44c1-b797-2809b3ce436f</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.config"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:3b:0e:78"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <target dev="tap949f6b09-9b"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/console.log" append="off"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:49:29 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:49:29 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:49:29 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:49:29 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.458 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Preparing to wait for external event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.458 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.458 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.459 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.459 186853 DEBUG nova.virt.libvirt.vif [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:49:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-915692846',display_name='tempest-AttachInterfacesUnderV243Test-server-915692846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-915692846',id=45,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI9I7AKHwzsdyW94e1BqU/1gxwz29bZH4aT61+oXBp3nixnDiOKpR15Cg7stiR0RFGonescIbjAMshz0mpaqhcsA6H0iKeUrj8npA+NNbPz2j6/WZYDlc6g1Ggpwz39eCw==',key_name='tempest-keypair-62070966',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='016630e9d4644c9a97e64dd376a8ea67',ramdisk_id='',reservation_id='r-wbtde39v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-58749522',owner_user_name='tempest-AttachInterfacesUnderV243Test-58749522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:49:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='734c4f1eee2d4b3b903662ad118275cb',uuid=e59c400d-6c4e-44c1-b797-2809b3ce436f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.459 186853 DEBUG nova.network.os_vif_util [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converting VIF {"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.460 186853 DEBUG nova.network.os_vif_util [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.460 186853 DEBUG os_vif [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.461 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.461 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.461 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.463 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.464 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap949f6b09-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.464 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap949f6b09-9b, col_values=(('external_ids', {'iface-id': '949f6b09-9b43-4db5-bd85-be3c96c1c5c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:0e:78', 'vm-uuid': 'e59c400d-6c4e-44c1-b797-2809b3ce436f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:29 np0005531887 NetworkManager[55210]: <info>  [1763797769.5039] manager: (tap949f6b09-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.503 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.511 186853 INFO os_vif [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b')#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.618 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.619 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.619 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] No VIF found with MAC fa:16:3e:3b:0e:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:49:29 np0005531887 nova_compute[186849]: 2025-11-22 07:49:29.619 186853 INFO nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Using config drive#033[00m
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.349 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.423 186853 INFO nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Creating config drive at /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.config#033[00m
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.429 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2d8anttj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.554 186853 DEBUG oslo_concurrency.processutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2d8anttj" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:30 np0005531887 kernel: tap949f6b09-9b: entered promiscuous mode
Nov 22 02:49:30 np0005531887 NetworkManager[55210]: <info>  [1763797770.6166] manager: (tap949f6b09-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Nov 22 02:49:30 np0005531887 systemd-udevd[218918]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:49:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:30Z|00090|binding|INFO|Claiming lport 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 for this chassis.
Nov 22 02:49:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:30Z|00091|binding|INFO|949f6b09-9b43-4db5-bd85-be3c96c1c5c0: Claiming fa:16:3e:3b:0e:78 10.100.0.10
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.649 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:30 np0005531887 NetworkManager[55210]: <info>  [1763797770.6623] device (tap949f6b09-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:49:30 np0005531887 NetworkManager[55210]: <info>  [1763797770.6629] device (tap949f6b09-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:49:30 np0005531887 systemd-machined[153180]: New machine qemu-17-instance-0000002d.
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.708 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:0e:78 10.100.0.10'], port_security=['fa:16:3e:3b:0e:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e59c400d-6c4e-44c1-b797-2809b3ce436f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '016630e9d4644c9a97e64dd376a8ea67', 'neutron:revision_number': '2', 'neutron:security_group_ids': '169bbea9-1e2a-4c8a-add2-24ce92a0eacf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1554aa3-0dc5-4add-a315-e9e7b12bbde4, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=949f6b09-9b43-4db5-bd85-be3c96c1c5c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.709 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 in datapath 89d2ed15-12a4-49e8-81fd-f7a793c4f290 bound to our chassis#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.710 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89d2ed15-12a4-49e8-81fd-f7a793c4f290#033[00m
Nov 22 02:49:30 np0005531887 systemd[1]: Started Virtual Machine qemu-17-instance-0000002d.
Nov 22 02:49:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:30Z|00092|binding|INFO|Setting lport 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 ovn-installed in OVS
Nov 22 02:49:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:30Z|00093|binding|INFO|Setting lport 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 up in Southbound
Nov 22 02:49:30 np0005531887 nova_compute[186849]: 2025-11-22 07:49:30.718 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.725 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9aade5e0-d832-4125-a385-7e64df95c616]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.726 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89d2ed15-11 in ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.730 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89d2ed15-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.730 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9098d90c-9a9f-403f-a9f9-2950aae335c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.732 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cca163d8-699b-4521-b234-2dc42eba5aef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.749 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[3d61afad-f64e-4989-86a6-1657756d5327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.778 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[88a0d073-e2f1-4478-8fa2-48873f6c2681]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.809 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[80ed94e4-677e-4840-afe1-91293681c293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 NetworkManager[55210]: <info>  [1763797770.8199] manager: (tap89d2ed15-10): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.819 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dc281a-0505-4d9a-b9e7-32ac7668020b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.855 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6064cd89-c267-4a24-ab86-deb97f4b701c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.860 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[845b5149-c0b2-4be5-b18a-4c306894ddb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 NetworkManager[55210]: <info>  [1763797770.8866] device (tap89d2ed15-10): carrier: link connected
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.896 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c93b8d17-866a-4b3c-9533-02b18b57dd15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.914 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f0963a-182e-408f-8bd4-fed66cf1fdd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89d2ed15-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:13:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451349, 'reachable_time': 43532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218954, 'error': None, 'target': 'ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.935 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6906a1a7-d94c-4eb6-8206-991d1d41f62c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:13e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451349, 'tstamp': 451349}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218955, 'error': None, 'target': 'ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.958 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[eefde470-7644-4704-bf3e-a1d9f73c5400]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89d2ed15-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:13:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451349, 'reachable_time': 43532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218956, 'error': None, 'target': 'ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:30.991 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5387c1c7-694f-4b17-90a3-92870b8180ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.048 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a6676497-20da-4661-b668-ee1e9a318a23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.049 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89d2ed15-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.049 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.051 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89d2ed15-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.054 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531887 NetworkManager[55210]: <info>  [1763797771.0546] manager: (tap89d2ed15-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 22 02:49:31 np0005531887 kernel: tap89d2ed15-10: entered promiscuous mode
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.058 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89d2ed15-10, col_values=(('external_ids', {'iface-id': '20ffbf51-15dc-43f9-acf2-c4ad36479165'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.060 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:31Z|00094|binding|INFO|Releasing lport 20ffbf51-15dc-43f9-acf2-c4ad36479165 from this chassis (sb_readonly=0)
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.061 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.061 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89d2ed15-12a4-49e8-81fd-f7a793c4f290.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89d2ed15-12a4-49e8-81fd-f7a793c4f290.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.062 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a51c5789-b353-4acf-8f16-4bc51c6d82e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.064 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-89d2ed15-12a4-49e8-81fd-f7a793c4f290
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/89d2ed15-12a4-49e8-81fd-f7a793c4f290.pid.haproxy
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 89d2ed15-12a4-49e8-81fd-f7a793c4f290
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:49:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:31.065 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'env', 'PROCESS_TAG=haproxy-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89d2ed15-12a4-49e8-81fd-f7a793c4f290.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.072 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.232 186853 DEBUG nova.compute.manager [req-99f04f5c-18f8-48db-9060-099ccc94cc80 req-4574b2df-65d1-4be6-bd5e-e5aa9b8fbad8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.233 186853 DEBUG oslo_concurrency.lockutils [req-99f04f5c-18f8-48db-9060-099ccc94cc80 req-4574b2df-65d1-4be6-bd5e-e5aa9b8fbad8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.233 186853 DEBUG oslo_concurrency.lockutils [req-99f04f5c-18f8-48db-9060-099ccc94cc80 req-4574b2df-65d1-4be6-bd5e-e5aa9b8fbad8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.233 186853 DEBUG oslo_concurrency.lockutils [req-99f04f5c-18f8-48db-9060-099ccc94cc80 req-4574b2df-65d1-4be6-bd5e-e5aa9b8fbad8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.234 186853 DEBUG nova.compute.manager [req-99f04f5c-18f8-48db-9060-099ccc94cc80 req-4574b2df-65d1-4be6-bd5e-e5aa9b8fbad8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Processing event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.381 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797771.3804793, e59c400d-6c4e-44c1-b797-2809b3ce436f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.381 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] VM Started (Lifecycle Event)#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.383 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.387 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.391 186853 INFO nova.virt.libvirt.driver [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Instance spawned successfully.#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.392 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.396 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.400 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.410 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.410 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.411 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.411 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.412 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.412 186853 DEBUG nova.virt.libvirt.driver [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.416 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.416 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797771.3805573, e59c400d-6c4e-44c1-b797-2809b3ce436f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.416 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.442 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.445 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797771.3871412, e59c400d-6c4e-44c1-b797-2809b3ce436f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.445 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.475 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.479 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.497 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:49:31 np0005531887 podman[218995]: 2025-11-22 07:49:31.436922928 +0000 UTC m=+0.024742916 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:49:31 np0005531887 podman[218995]: 2025-11-22 07:49:31.531772565 +0000 UTC m=+0.119592533 container create 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.570 186853 INFO nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Took 6.90 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.572 186853 DEBUG nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:49:31 np0005531887 systemd[1]: Started libpod-conmon-90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d.scope.
Nov 22 02:49:31 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:49:31 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73e0c0fbd9afc3d5d6c4bf9f61e8db171c1203f1e004e0a1318f281595228d2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:49:31 np0005531887 podman[218995]: 2025-11-22 07:49:31.657130842 +0000 UTC m=+0.244950840 container init 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 02:49:31 np0005531887 podman[218995]: 2025-11-22 07:49:31.663341496 +0000 UTC m=+0.251161464 container start 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:49:31 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [NOTICE]   (219015) : New worker (219017) forked
Nov 22 02:49:31 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [NOTICE]   (219015) : Loading success.
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.753 186853 DEBUG nova.network.neutron [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updated VIF entry in instance network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.754 186853 DEBUG nova.network.neutron [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.773 186853 DEBUG oslo_concurrency.lockutils [req-68ada873-6277-4c46-9ebd-556f196319e6 req-254d350f-0314-4fd3-bc1a-061fd136705f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:31 np0005531887 nova_compute[186849]: 2025-11-22 07:49:31.935 186853 INFO nova.compute.manager [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Took 7.75 seconds to build instance.#033[00m
Nov 22 02:49:32 np0005531887 nova_compute[186849]: 2025-11-22 07:49:32.052 186853 DEBUG oslo_concurrency.lockutils [None req-77dba621-797c-4e6b-b663-e037710ca55c 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.971s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.476 186853 DEBUG nova.compute.manager [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.476 186853 DEBUG oslo_concurrency.lockutils [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.477 186853 DEBUG oslo_concurrency.lockutils [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.477 186853 DEBUG oslo_concurrency.lockutils [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.477 186853 DEBUG nova.compute.manager [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] No waiting events found dispatching network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:49:33 np0005531887 nova_compute[186849]: 2025-11-22 07:49:33.477 186853 WARNING nova.compute.manager [req-7c472885-f099-4ee9-b580-b4bef7797ad3 req-09058943-c692-4943-ab1a-697d0985df88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received unexpected event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:49:34 np0005531887 nova_compute[186849]: 2025-11-22 07:49:34.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:34 np0005531887 podman[219026]: 2025-11-22 07:49:34.858424773 +0000 UTC m=+0.081022454 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.348 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:35 np0005531887 NetworkManager[55210]: <info>  [1763797775.3496] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 22 02:49:35 np0005531887 NetworkManager[55210]: <info>  [1763797775.3507] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.421 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.424 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:35Z|00095|binding|INFO|Releasing lport 20ffbf51-15dc-43f9-acf2-c4ad36479165 from this chassis (sb_readonly=0)
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.822 186853 DEBUG nova.compute.manager [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.822 186853 DEBUG nova.compute.manager [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing instance network info cache due to event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.823 186853 DEBUG oslo_concurrency.lockutils [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.823 186853 DEBUG oslo_concurrency.lockutils [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:35 np0005531887 nova_compute[186849]: 2025-11-22 07:49:35.823 186853 DEBUG nova.network.neutron [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:37.321 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:38 np0005531887 nova_compute[186849]: 2025-11-22 07:49:38.556 186853 DEBUG nova.network.neutron [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updated VIF entry in instance network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:49:38 np0005531887 nova_compute[186849]: 2025-11-22 07:49:38.558 186853 DEBUG nova.network.neutron [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:38 np0005531887 nova_compute[186849]: 2025-11-22 07:49:38.580 186853 DEBUG oslo_concurrency.lockutils [req-8d0bf9c5-8bb5-4143-8ba4-b7acf4b4f8cf req-c5abe4e4-76c8-4b80-ae31-50d1494ebb13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:38 np0005531887 podman[219051]: 2025-11-22 07:49:38.845038174 +0000 UTC m=+0.063570021 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 02:49:39 np0005531887 nova_compute[186849]: 2025-11-22 07:49:39.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:39 np0005531887 nova_compute[186849]: 2025-11-22 07:49:39.794 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:39 np0005531887 nova_compute[186849]: 2025-11-22 07:49:39.795 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:39 np0005531887 nova_compute[186849]: 2025-11-22 07:49:39.795 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:40 np0005531887 nova_compute[186849]: 2025-11-22 07:49:40.428 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:41 np0005531887 nova_compute[186849]: 2025-11-22 07:49:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:41 np0005531887 nova_compute[186849]: 2025-11-22 07:49:41.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:49:41 np0005531887 nova_compute[186849]: 2025-11-22 07:49:41.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:49:41 np0005531887 podman[219071]: 2025-11-22 07:49:41.841707909 +0000 UTC m=+0.057672164 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:49:42 np0005531887 nova_compute[186849]: 2025-11-22 07:49:42.444 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:42 np0005531887 nova_compute[186849]: 2025-11-22 07:49:42.445 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:42 np0005531887 nova_compute[186849]: 2025-11-22 07:49:42.445 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:49:42 np0005531887 nova_compute[186849]: 2025-11-22 07:49:42.445 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.556 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.641 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.656 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.657 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.658 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.659 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.659 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.678 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.678 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.678 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.679 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.743 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.816 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.817 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:49:44 np0005531887 nova_compute[186849]: 2025-11-22 07:49:44.908 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.087 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.089 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5573MB free_disk=73.4539566040039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.089 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.089 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.157 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance e59c400d-6c4e-44c1-b797-2809b3ce436f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.158 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.158 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.200 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.210 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.273 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.274 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.384 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.409 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.409 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.410 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:49:45 np0005531887 nova_compute[186849]: 2025-11-22 07:49:45.428 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:45Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:0e:78 10.100.0.10
Nov 22 02:49:45 np0005531887 ovn_controller[95130]: 2025-11-22T07:49:45Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:0e:78 10.100.0.10
Nov 22 02:49:46 np0005531887 podman[219112]: 2025-11-22 07:49:46.860278631 +0000 UTC m=+0.078332518 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:49:49 np0005531887 nova_compute[186849]: 2025-11-22 07:49:49.593 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:50 np0005531887 nova_compute[186849]: 2025-11-22 07:49:50.431 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:52 np0005531887 podman[219136]: 2025-11-22 07:49:52.83819499 +0000 UTC m=+0.062072905 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:49:54 np0005531887 nova_compute[186849]: 2025-11-22 07:49:54.594 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:55 np0005531887 nova_compute[186849]: 2025-11-22 07:49:55.339 186853 DEBUG nova.objects.instance [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lazy-loading 'flavor' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:49:55 np0005531887 nova_compute[186849]: 2025-11-22 07:49:55.397 186853 DEBUG oslo_concurrency.lockutils [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:55 np0005531887 nova_compute[186849]: 2025-11-22 07:49:55.397 186853 DEBUG oslo_concurrency.lockutils [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:55 np0005531887 nova_compute[186849]: 2025-11-22 07:49:55.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:55 np0005531887 nova_compute[186849]: 2025-11-22 07:49:55.836 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:49:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:55.837 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:49:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:49:55.838 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:49:56 np0005531887 nova_compute[186849]: 2025-11-22 07:49:56.912 186853 DEBUG nova.network.neutron [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:49:58 np0005531887 nova_compute[186849]: 2025-11-22 07:49:58.233 186853 DEBUG nova.compute.manager [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:49:58 np0005531887 nova_compute[186849]: 2025-11-22 07:49:58.233 186853 DEBUG nova.compute.manager [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing instance network info cache due to event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:49:58 np0005531887 nova_compute[186849]: 2025-11-22 07:49:58.233 186853 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:49:58 np0005531887 podman[219159]: 2025-11-22 07:49:58.856637692 +0000 UTC m=+0.071465507 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:49:58 np0005531887 podman[219160]: 2025-11-22 07:49:58.916944991 +0000 UTC m=+0.126699779 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.558 186853 DEBUG nova.network.neutron [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.575 186853 DEBUG oslo_concurrency.lockutils [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.576 186853 DEBUG nova.compute.manager [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.576 186853 DEBUG nova.compute.manager [None req-36dd94b8-9d2e-430f-b1b2-0039ce9d279d 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] network_info to inject: |[{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.579 186853 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.579 186853 DEBUG nova.network.neutron [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:49:59 np0005531887 nova_compute[186849]: 2025-11-22 07:49:59.617 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:00 np0005531887 nova_compute[186849]: 2025-11-22 07:50:00.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:00.840 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:00 np0005531887 nova_compute[186849]: 2025-11-22 07:50:00.841 186853 DEBUG nova.objects.instance [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lazy-loading 'flavor' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:00 np0005531887 nova_compute[186849]: 2025-11-22 07:50:00.880 186853 DEBUG oslo_concurrency.lockutils [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:01 np0005531887 nova_compute[186849]: 2025-11-22 07:50:01.158 186853 DEBUG nova.network.neutron [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updated VIF entry in instance network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:01 np0005531887 nova_compute[186849]: 2025-11-22 07:50:01.159 186853 DEBUG nova.network.neutron [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:01 np0005531887 nova_compute[186849]: 2025-11-22 07:50:01.175 186853 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:01 np0005531887 nova_compute[186849]: 2025-11-22 07:50:01.176 186853 DEBUG oslo_concurrency.lockutils [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:02 np0005531887 nova_compute[186849]: 2025-11-22 07:50:02.299 186853 DEBUG nova.network.neutron [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:50:02 np0005531887 nova_compute[186849]: 2025-11-22 07:50:02.459 186853 DEBUG nova.compute.manager [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:02 np0005531887 nova_compute[186849]: 2025-11-22 07:50:02.460 186853 DEBUG nova.compute.manager [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing instance network info cache due to event network-changed-949f6b09-9b43-4db5-bd85-be3c96c1c5c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:50:02 np0005531887 nova_compute[186849]: 2025-11-22 07:50:02.460 186853 DEBUG oslo_concurrency.lockutils [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.123 186853 DEBUG nova.network.neutron [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.198 186853 DEBUG oslo_concurrency.lockutils [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.199 186853 DEBUG nova.compute.manager [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.199 186853 DEBUG nova.compute.manager [None req-9440f4bd-b9e8-457d-8e7a-374af432cfa3 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] network_info to inject: |[{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.202 186853 DEBUG oslo_concurrency.lockutils [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.202 186853 DEBUG nova.network.neutron [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Refreshing network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:50:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:04Z|00096|binding|INFO|Releasing lport 20ffbf51-15dc-43f9-acf2-c4ad36479165 from this chassis (sb_readonly=0)
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.525 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:04 np0005531887 nova_compute[186849]: 2025-11-22 07:50:04.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.075 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.075 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.076 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.076 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.076 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.084 186853 INFO nova.compute.manager [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Terminating instance#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.091 186853 DEBUG nova.compute.manager [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:50:05 np0005531887 kernel: tap949f6b09-9b (unregistering): left promiscuous mode
Nov 22 02:50:05 np0005531887 NetworkManager[55210]: <info>  [1763797805.1219] device (tap949f6b09-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:50:05 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:05Z|00097|binding|INFO|Releasing lport 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 from this chassis (sb_readonly=0)
Nov 22 02:50:05 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:05Z|00098|binding|INFO|Setting lport 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 down in Southbound
Nov 22 02:50:05 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:05Z|00099|binding|INFO|Removing iface tap949f6b09-9b ovn-installed in OVS
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.128 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.130 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.134 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:0e:78 10.100.0.10'], port_security=['fa:16:3e:3b:0e:78 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e59c400d-6c4e-44c1-b797-2809b3ce436f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '016630e9d4644c9a97e64dd376a8ea67', 'neutron:revision_number': '6', 'neutron:security_group_ids': '169bbea9-1e2a-4c8a-add2-24ce92a0eacf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1554aa3-0dc5-4add-a315-e9e7b12bbde4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=949f6b09-9b43-4db5-bd85-be3c96c1c5c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.135 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0 in datapath 89d2ed15-12a4-49e8-81fd-f7a793c4f290 unbound from our chassis#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.136 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89d2ed15-12a4-49e8-81fd-f7a793c4f290, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.139 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5096a7-1c6d-4912-a5fd-21ade6f26a73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.140 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290 namespace which is not needed anymore#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Nov 22 02:50:05 np0005531887 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002d.scope: Consumed 15.994s CPU time.
Nov 22 02:50:05 np0005531887 systemd-machined[153180]: Machine qemu-17-instance-0000002d terminated.
Nov 22 02:50:05 np0005531887 podman[219206]: 2025-11-22 07:50:05.206815802 +0000 UTC m=+0.057961872 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:50:05 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [NOTICE]   (219015) : haproxy version is 2.8.14-c23fe91
Nov 22 02:50:05 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [NOTICE]   (219015) : path to executable is /usr/sbin/haproxy
Nov 22 02:50:05 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [WARNING]  (219015) : Exiting Master process...
Nov 22 02:50:05 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [ALERT]    (219015) : Current worker (219017) exited with code 143 (Terminated)
Nov 22 02:50:05 np0005531887 neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290[219011]: [WARNING]  (219015) : All workers exited. Exiting... (0)
Nov 22 02:50:05 np0005531887 systemd[1]: libpod-90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d.scope: Deactivated successfully.
Nov 22 02:50:05 np0005531887 podman[219253]: 2025-11-22 07:50:05.294037189 +0000 UTC m=+0.053550781 container died 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:50:05 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d-userdata-shm.mount: Deactivated successfully.
Nov 22 02:50:05 np0005531887 systemd[1]: var-lib-containers-storage-overlay-73e0c0fbd9afc3d5d6c4bf9f61e8db171c1203f1e004e0a1318f281595228d2f-merged.mount: Deactivated successfully.
Nov 22 02:50:05 np0005531887 podman[219253]: 2025-11-22 07:50:05.361976738 +0000 UTC m=+0.121490330 container cleanup 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:50:05 np0005531887 systemd[1]: libpod-conmon-90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d.scope: Deactivated successfully.
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.373 186853 INFO nova.virt.libvirt.driver [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Instance destroyed successfully.#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.375 186853 DEBUG nova.objects.instance [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lazy-loading 'resources' on Instance uuid e59c400d-6c4e-44c1-b797-2809b3ce436f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.392 186853 DEBUG nova.virt.libvirt.vif [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:49:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-915692846',display_name='tempest-AttachInterfacesUnderV243Test-server-915692846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-915692846',id=45,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI9I7AKHwzsdyW94e1BqU/1gxwz29bZH4aT61+oXBp3nixnDiOKpR15Cg7stiR0RFGonescIbjAMshz0mpaqhcsA6H0iKeUrj8npA+NNbPz2j6/WZYDlc6g1Ggpwz39eCw==',key_name='tempest-keypair-62070966',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:49:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='016630e9d4644c9a97e64dd376a8ea67',ramdisk_id='',reservation_id='r-wbtde39v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-58749522',owner_user_name='tempest-AttachInterfacesUnderV243Test-58749522-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='734c4f1eee2d4b3b903662ad118275cb',uuid=e59c400d-6c4e-44c1-b797-2809b3ce436f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.393 186853 DEBUG nova.network.os_vif_util [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converting VIF {"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.394 186853 DEBUG nova.network.os_vif_util [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.395 186853 DEBUG os_vif [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.398 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap949f6b09-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.400 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.408 186853 INFO os_vif [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:0e:78,bridge_name='br-int',has_traffic_filtering=True,id=949f6b09-9b43-4db5-bd85-be3c96c1c5c0,network=Network(89d2ed15-12a4-49e8-81fd-f7a793c4f290),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap949f6b09-9b')#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.409 186853 INFO nova.virt.libvirt.driver [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Deleting instance files /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f_del#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.410 186853 INFO nova.virt.libvirt.driver [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Deletion of /var/lib/nova/instances/e59c400d-6c4e-44c1-b797-2809b3ce436f_del complete#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.431 186853 DEBUG nova.compute.manager [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-unplugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.432 186853 DEBUG oslo_concurrency.lockutils [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.432 186853 DEBUG oslo_concurrency.lockutils [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.432 186853 DEBUG oslo_concurrency.lockutils [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.432 186853 DEBUG nova.compute.manager [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] No waiting events found dispatching network-vif-unplugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.432 186853 DEBUG nova.compute.manager [req-bef0a482-38d8-45b6-b0d6-4720905cf236 req-ebb63a7a-dbfb-4859-ab7d-f9d879c1ccbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-unplugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 podman[219304]: 2025-11-22 07:50:05.477751135 +0000 UTC m=+0.086555912 container remove 90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.484 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[456d0bc8-8fcc-4a62-aa59-027c4f1e0259]: (4, ('Sat Nov 22 07:50:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290 (90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d)\n90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d\nSat Nov 22 07:50:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290 (90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d)\n90d85c157b8f25cb0706e7b0ffafc250378145f5e765d7cf2839afe5a963dc1d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.485 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2889ff01-7ba7-44b0-98fb-d8ea17f1ab7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.486 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89d2ed15-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.488 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 kernel: tap89d2ed15-10: left promiscuous mode
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.505 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[34bffc81-b21d-4dc9-8804-e4619466d1db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.518 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[959c8d9e-e46b-4ef2-b4ed-bcb1bfb335bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.519 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[54e789b5-5813-4afd-bb9e-aed5d7539628]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.535 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[59494a86-dc29-403a-8ac6-ae6f95cf3f80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451341, 'reachable_time': 22958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219317, 'error': None, 'target': 'ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.538 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89d2ed15-12a4-49e8-81fd-f7a793c4f290 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:50:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:05.538 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[46b3afb6-dfd5-4ea6-947c-cc7fd40ea8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:05 np0005531887 systemd[1]: run-netns-ovnmeta\x2d89d2ed15\x2d12a4\x2d49e8\x2d81fd\x2df7a793c4f290.mount: Deactivated successfully.
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.556 186853 INFO nova.compute.manager [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.557 186853 DEBUG oslo.service.loopingcall [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.557 186853 DEBUG nova.compute.manager [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:50:05 np0005531887 nova_compute[186849]: 2025-11-22 07:50:05.558 186853 DEBUG nova.network.neutron [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.540 186853 DEBUG nova.network.neutron [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updated VIF entry in instance network info cache for port 949f6b09-9b43-4db5-bd85-be3c96c1c5c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.540 186853 DEBUG nova.network.neutron [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [{"id": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "address": "fa:16:3e:3b:0e:78", "network": {"id": "89d2ed15-12a4-49e8-81fd-f7a793c4f290", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1034549749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016630e9d4644c9a97e64dd376a8ea67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap949f6b09-9b", "ovs_interfaceid": "949f6b09-9b43-4db5-bd85-be3c96c1c5c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.544 186853 DEBUG nova.compute.manager [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.545 186853 DEBUG oslo_concurrency.lockutils [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.545 186853 DEBUG oslo_concurrency.lockutils [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.545 186853 DEBUG oslo_concurrency.lockutils [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.545 186853 DEBUG nova.compute.manager [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] No waiting events found dispatching network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.545 186853 WARNING nova.compute.manager [req-5cfb353b-23e7-4c3a-bd8d-050b998e977e req-53339004-a6d3-448f-a87e-3a27e5261241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received unexpected event network-vif-plugged-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:50:07 np0005531887 nova_compute[186849]: 2025-11-22 07:50:07.557 186853 DEBUG oslo_concurrency.lockutils [req-f054b17c-6327-44e5-a6dc-dd5de766e60a req-7ee8d408-b5fd-48b9-bde6-bec40127a894 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e59c400d-6c4e-44c1-b797-2809b3ce436f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.004 186853 DEBUG nova.network.neutron [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.029 186853 INFO nova.compute.manager [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Took 3.47 seconds to deallocate network for instance.#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.106 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.107 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.141 186853 DEBUG nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.173 186853 DEBUG nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.174 186853 DEBUG nova.compute.provider_tree [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.214 186853 DEBUG nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.258 186853 DEBUG nova.compute.manager [req-02d03c3d-611e-4485-ac69-aee1e8885038 req-3fb57e7d-660c-4382-9386-2554d500d647 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Received event network-vif-deleted-949f6b09-9b43-4db5-bd85-be3c96c1c5c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.259 186853 DEBUG nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.327 186853 DEBUG nova.compute.provider_tree [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.347 186853 DEBUG nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.373 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.439 186853 INFO nova.scheduler.client.report [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Deleted allocations for instance e59c400d-6c4e-44c1-b797-2809b3ce436f#033[00m
Nov 22 02:50:09 np0005531887 nova_compute[186849]: 2025-11-22 07:50:09.552 186853 DEBUG oslo_concurrency.lockutils [None req-d6298a60-c472-473a-a6ec-925a6f72a02f 734c4f1eee2d4b3b903662ad118275cb 016630e9d4644c9a97e64dd376a8ea67 - - default default] Lock "e59c400d-6c4e-44c1-b797-2809b3ce436f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:09 np0005531887 podman[219320]: 2025-11-22 07:50:09.832016225 +0000 UTC m=+0.052376042 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:50:10 np0005531887 nova_compute[186849]: 2025-11-22 07:50:10.401 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:10 np0005531887 nova_compute[186849]: 2025-11-22 07:50:10.485 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:12 np0005531887 podman[219339]: 2025-11-22 07:50:12.83431683 +0000 UTC m=+0.058466223 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:50:14 np0005531887 nova_compute[186849]: 2025-11-22 07:50:14.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:14 np0005531887 nova_compute[186849]: 2025-11-22 07:50:14.635 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:15 np0005531887 nova_compute[186849]: 2025-11-22 07:50:15.404 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:15 np0005531887 nova_compute[186849]: 2025-11-22 07:50:15.487 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:17 np0005531887 podman[219361]: 2025-11-22 07:50:17.829370978 +0000 UTC m=+0.054440964 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:50:20 np0005531887 nova_compute[186849]: 2025-11-22 07:50:20.377 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797805.3698368, e59c400d-6c4e-44c1-b797-2809b3ce436f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:20 np0005531887 nova_compute[186849]: 2025-11-22 07:50:20.378 186853 INFO nova.compute.manager [-] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:50:20 np0005531887 nova_compute[186849]: 2025-11-22 07:50:20.399 186853 DEBUG nova.compute.manager [None req-f5e74ba5-4655-47df-8aea-fad617a779ee - - - - - -] [instance: e59c400d-6c4e-44c1-b797-2809b3ce436f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:20 np0005531887 nova_compute[186849]: 2025-11-22 07:50:20.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:20 np0005531887 nova_compute[186849]: 2025-11-22 07:50:20.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:23 np0005531887 podman[219385]: 2025-11-22 07:50:23.834598885 +0000 UTC m=+0.055076120 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 22 02:50:25 np0005531887 nova_compute[186849]: 2025-11-22 07:50:25.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:25 np0005531887 nova_compute[186849]: 2025-11-22 07:50:25.491 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.600 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.600 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.618 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.986 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.986 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.997 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:50:28 np0005531887 nova_compute[186849]: 2025-11-22 07:50:28.997 186853 INFO nova.compute.claims [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.121 186853 DEBUG nova.compute.provider_tree [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.148 186853 DEBUG nova.scheduler.client.report [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.190 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.191 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.270 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.271 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.288 186853 INFO nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.307 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.409 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.411 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.411 186853 INFO nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Creating image(s)#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.412 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.412 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.413 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.426 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.505 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.506 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.506 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.518 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.586 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.588 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.645 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.647 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.647 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.714 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.715 186853 DEBUG nova.virt.disk.api [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Checking if we can resize image /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.715 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.785 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.786 186853 DEBUG nova.virt.disk.api [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Cannot resize image /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.787 186853 DEBUG nova.objects.instance [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lazy-loading 'migration_context' on Instance uuid c290ea3f-b425-4f25-946d-6b45d2ec31e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.799 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.799 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Ensure instance console log exists: /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.800 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.800 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.800 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:29 np0005531887 podman[219419]: 2025-11-22 07:50:29.846017724 +0000 UTC m=+0.068328590 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:50:29 np0005531887 podman[219420]: 2025-11-22 07:50:29.866400641 +0000 UTC m=+0.085609269 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:50:29 np0005531887 nova_compute[186849]: 2025-11-22 07:50:29.947 186853 DEBUG nova.policy [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '617dbb2ad35c42bc834156437fc93e34', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64566cf00036456abfe577ae2fef6a7c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:50:30 np0005531887 nova_compute[186849]: 2025-11-22 07:50:30.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:30 np0005531887 nova_compute[186849]: 2025-11-22 07:50:30.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:31 np0005531887 nova_compute[186849]: 2025-11-22 07:50:31.863 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Successfully created port: f438eb18-11d6-49aa-9cc3-1a2656d47a6e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.683 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Successfully updated port: f438eb18-11d6-49aa-9cc3-1a2656d47a6e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.707 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.707 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquired lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.708 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.750 186853 DEBUG nova.compute.manager [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-changed-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.751 186853 DEBUG nova.compute.manager [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Refreshing instance network info cache due to event network-changed-f438eb18-11d6-49aa-9cc3-1a2656d47a6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.751 186853 DEBUG oslo_concurrency.lockutils [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:33 np0005531887 nova_compute[186849]: 2025-11-22 07:50:33.917 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.349 186853 DEBUG nova.network.neutron [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updating instance_info_cache with network_info: [{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.494 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.680 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Releasing lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.680 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Instance network_info: |[{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.681 186853 DEBUG oslo_concurrency.lockutils [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.681 186853 DEBUG nova.network.neutron [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Refreshing network info cache for port f438eb18-11d6-49aa-9cc3-1a2656d47a6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.684 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Start _get_guest_xml network_info=[{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.690 186853 WARNING nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.696 186853 DEBUG nova.virt.libvirt.host [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.697 186853 DEBUG nova.virt.libvirt.host [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.701 186853 DEBUG nova.virt.libvirt.host [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.702 186853 DEBUG nova.virt.libvirt.host [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.703 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.703 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.704 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.704 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.704 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.704 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.705 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.705 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.705 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.705 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.705 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.706 186853 DEBUG nova.virt.hardware [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.709 186853 DEBUG nova.virt.libvirt.vif [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=49,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCq3tOHLxFADk08BTE74la/pkcSbzch0W8vcagtv1h5VaBxLk97y+NzQbQG03sgRQKdqFzRFfi3kZFVvixXTGpd83WduPNQtWl9jKX5/A3kaBuFKJlYMCgUPssX9D8icWg==',key_name='tempest-keypair-922179052',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64566cf00036456abfe577ae2fef6a7c',ramdisk_id='',reservation_id='r-vxefzfau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-925199310',owner_user_name='tempest-ServersV294TestFqdnHostnames-925199310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='617dbb2ad35c42bc834156437fc93e34',uuid=c290ea3f-b425-4f25-946d-6b45d2ec31e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.709 186853 DEBUG nova.network.os_vif_util [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converting VIF {"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.710 186853 DEBUG nova.network.os_vif_util [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.711 186853 DEBUG nova.objects.instance [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lazy-loading 'pci_devices' on Instance uuid c290ea3f-b425-4f25-946d-6b45d2ec31e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.748 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <uuid>c290ea3f-b425-4f25-946d-6b45d2ec31e5</uuid>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <name>instance-00000031</name>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:name>guest-instance-1</nova:name>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:50:35</nova:creationTime>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:user uuid="617dbb2ad35c42bc834156437fc93e34">tempest-ServersV294TestFqdnHostnames-925199310-project-member</nova:user>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:project uuid="64566cf00036456abfe577ae2fef6a7c">tempest-ServersV294TestFqdnHostnames-925199310</nova:project>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        <nova:port uuid="f438eb18-11d6-49aa-9cc3-1a2656d47a6e">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="serial">c290ea3f-b425-4f25-946d-6b45d2ec31e5</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="uuid">c290ea3f-b425-4f25-946d-6b45d2ec31e5</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.config"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:a4:a6:64"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <target dev="tapf438eb18-11"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/console.log" append="off"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:50:35 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:50:35 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:50:35 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:50:35 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.749 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Preparing to wait for external event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.750 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.750 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.750 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.751 186853 DEBUG nova.virt.libvirt.vif [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=49,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCq3tOHLxFADk08BTE74la/pkcSbzch0W8vcagtv1h5VaBxLk97y+NzQbQG03sgRQKdqFzRFfi3kZFVvixXTGpd83WduPNQtWl9jKX5/A3kaBuFKJlYMCgUPssX9D8icWg==',key_name='tempest-keypair-922179052',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64566cf00036456abfe577ae2fef6a7c',ramdisk_id='',reservation_id='r-vxefzfau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-925199310',owner_user_name='tempest-ServersV294TestFqdnHostnames-925199310-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='617dbb2ad35c42bc834156437fc93e34',uuid=c290ea3f-b425-4f25-946d-6b45d2ec31e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.751 186853 DEBUG nova.network.os_vif_util [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converting VIF {"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.752 186853 DEBUG nova.network.os_vif_util [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.752 186853 DEBUG os_vif [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.752 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.753 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.753 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.755 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.755 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf438eb18-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.756 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf438eb18-11, col_values=(('external_ids', {'iface-id': 'f438eb18-11d6-49aa-9cc3-1a2656d47a6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:a6:64', 'vm-uuid': 'c290ea3f-b425-4f25-946d-6b45d2ec31e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.757 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 NetworkManager[55210]: <info>  [1763797835.7588] manager: (tapf438eb18-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.765 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.766 186853 INFO os_vif [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11')#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.822 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.822 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.823 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] No VIF found with MAC fa:16:3e:a4:a6:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:50:35 np0005531887 nova_compute[186849]: 2025-11-22 07:50:35.823 186853 INFO nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Using config drive#033[00m
Nov 22 02:50:35 np0005531887 podman[219466]: 2025-11-22 07:50:35.844646077 +0000 UTC m=+0.062020423 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.664 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c290ea3f-b425-4f25-946d-6b45d2ec31e5', 'name': 'guest-instance-1', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000031', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '64566cf00036456abfe577ae2fef6a7c', 'user_id': '617dbb2ad35c42bc834156437fc93e34', 'hostId': 'efeda85825d47863e2c1e0b09b8fc3df5bae43c2ff0fb0beb643dace', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.666 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.666 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.666 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>]
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.667 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.667 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>]
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.668 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.669 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.670 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.671 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.671 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>]
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.672 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.673 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.673 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.675 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.676 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.677 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.678 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.679 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.680 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.680 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.681 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.682 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.683 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.683 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.684 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.685 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.686 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.686 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.687 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: guest-instance-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: guest-instance-1>]
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:50:36.688 12 DEBUG ceilometer.compute.pollsters [-] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000031, id=c290ea3f-b425-4f25-946d-6b45d2ec31e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 02:50:36 np0005531887 nova_compute[186849]: 2025-11-22 07:50:36.881 186853 INFO nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Creating config drive at /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.config#033[00m
Nov 22 02:50:36 np0005531887 nova_compute[186849]: 2025-11-22 07:50:36.892 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp49jdu3sw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.022 186853 DEBUG oslo_concurrency.processutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp49jdu3sw" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:37 np0005531887 kernel: tapf438eb18-11: entered promiscuous mode
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.0893] manager: (tapf438eb18-11): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Nov 22 02:50:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:37Z|00100|binding|INFO|Claiming lport f438eb18-11d6-49aa-9cc3-1a2656d47a6e for this chassis.
Nov 22 02:50:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:37Z|00101|binding|INFO|f438eb18-11d6-49aa-9cc3-1a2656d47a6e: Claiming fa:16:3e:a4:a6:64 10.100.0.7
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.105 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:a6:64 10.100.0.7'], port_security=['fa:16:3e:a4:a6:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c290ea3f-b425-4f25-946d-6b45d2ec31e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64566cf00036456abfe577ae2fef6a7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f752c9b6-5e6c-4195-af4e-8cb2b6c614cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d44a3a02-a156-4d14-ba6f-788e886825aa, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f438eb18-11d6-49aa-9cc3-1a2656d47a6e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.106 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f438eb18-11d6-49aa-9cc3-1a2656d47a6e in datapath 9a29fb10-157b-4bc3-b002-01a9b93f1b72 bound to our chassis#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.107 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a29fb10-157b-4bc3-b002-01a9b93f1b72#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.119 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e994d3-a080-445d-b8d8-57fdd2287f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.120 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9a29fb10-11 in ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.122 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9a29fb10-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.122 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[05826320-1ecc-493d-9082-e82542d6a4bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.123 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[550b2000-7add-4483-b226-623030739a2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 systemd-udevd[219509]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.136 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[866e795b-fbe8-4f4b-985a-373d2b396906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.1448] device (tapf438eb18-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.1458] device (tapf438eb18-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:50:37 np0005531887 systemd-machined[153180]: New machine qemu-18-instance-00000031.
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.155 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0c35dd7b-6236-426e-b368-8099fabd2a00]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:37Z|00102|binding|INFO|Setting lport f438eb18-11d6-49aa-9cc3-1a2656d47a6e ovn-installed in OVS
Nov 22 02:50:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:37Z|00103|binding|INFO|Setting lport f438eb18-11d6-49aa-9cc3-1a2656d47a6e up in Southbound
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.158 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 systemd[1]: Started Virtual Machine qemu-18-instance-00000031.
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.184 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb51731-355d-4b55-ab79-5a856c2dc47a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 systemd-udevd[219514]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.189 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fe611572-e009-4725-a038-6de6088ff16a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.1906] manager: (tap9a29fb10-10): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.222 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4c60ad13-3e97-4973-8f88-e23d1b80a346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.224 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ec111703-7b41-4155-a8b1-37e3202b3b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.2455] device (tap9a29fb10-10): carrier: link connected
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.249 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8c2e846e-aa78-4058-a160-2f605f4fc652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.265 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[67188ec1-4c00-4487-adcc-3c8d99ebc92d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a29fb10-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:60:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457985, 'reachable_time': 35438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219542, 'error': None, 'target': 'ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.281 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[115cfba5-8863-40b8-add2-30a3d681700a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:60b7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457985, 'tstamp': 457985}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219543, 'error': None, 'target': 'ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.300 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[09a855fc-7acc-4787-9e03-7567afda0e45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a29fb10-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:60:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457985, 'reachable_time': 35438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219544, 'error': None, 'target': 'ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.352 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[971929ed-dac8-4d42-8c77-bd3acf6c05ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.429 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[22309577-ced6-4e87-b7c1-109d699f371f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.431 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a29fb10-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.432 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.432 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a29fb10-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.434 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 kernel: tap9a29fb10-10: entered promiscuous mode
Nov 22 02:50:37 np0005531887 NetworkManager[55210]: <info>  [1763797837.4362] manager: (tap9a29fb10-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.441 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a29fb10-10, col_values=(('external_ids', {'iface-id': 'bb242c10-f687-4325-8b1f-4e7f4a04ebdb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.443 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:37Z|00104|binding|INFO|Releasing lport bb242c10-f687-4325-8b1f-4e7f4a04ebdb from this chassis (sb_readonly=0)
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.466 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.468 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9a29fb10-157b-4bc3-b002-01a9b93f1b72.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9a29fb10-157b-4bc3-b002-01a9b93f1b72.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.469 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa4ee42-f590-416f-906b-4f86726b0119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.470 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-9a29fb10-157b-4bc3-b002-01a9b93f1b72
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/9a29fb10-157b-4bc3-b002-01a9b93f1b72.pid.haproxy
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 9a29fb10-157b-4bc3-b002-01a9b93f1b72
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:37.470 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'env', 'PROCESS_TAG=haproxy-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9a29fb10-157b-4bc3-b002-01a9b93f1b72.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.508 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797837.5076568, c290ea3f-b425-4f25-946d-6b45d2ec31e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.508 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] VM Started (Lifecycle Event)#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.532 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.537 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797837.5078688, c290ea3f-b425-4f25-946d-6b45d2ec31e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.538 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.559 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.564 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:37 np0005531887 nova_compute[186849]: 2025-11-22 07:50:37.583 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:37 np0005531887 podman[219583]: 2025-11-22 07:50:37.879765422 +0000 UTC m=+0.069122139 container create 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 02:50:37 np0005531887 systemd[1]: Started libpod-conmon-9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a.scope.
Nov 22 02:50:37 np0005531887 podman[219583]: 2025-11-22 07:50:37.839393688 +0000 UTC m=+0.028750455 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:50:37 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:50:37 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e82232b269efa26f45c44968c6315caefcba38747368d4f00ea8ac4583970db0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.013 186853 DEBUG nova.compute.manager [req-7ddc7976-b580-4660-9dda-eb6dd42d1668 req-919d3634-39fc-44f9-823a-f131cc1bb16b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.013 186853 DEBUG oslo_concurrency.lockutils [req-7ddc7976-b580-4660-9dda-eb6dd42d1668 req-919d3634-39fc-44f9-823a-f131cc1bb16b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.013 186853 DEBUG oslo_concurrency.lockutils [req-7ddc7976-b580-4660-9dda-eb6dd42d1668 req-919d3634-39fc-44f9-823a-f131cc1bb16b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.014 186853 DEBUG oslo_concurrency.lockutils [req-7ddc7976-b580-4660-9dda-eb6dd42d1668 req-919d3634-39fc-44f9-823a-f131cc1bb16b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.014 186853 DEBUG nova.compute.manager [req-7ddc7976-b580-4660-9dda-eb6dd42d1668 req-919d3634-39fc-44f9-823a-f131cc1bb16b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Processing event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.014 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.020 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797838.0204947, c290ea3f-b425-4f25-946d-6b45d2ec31e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.021 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.024 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.027 186853 INFO nova.virt.libvirt.driver [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Instance spawned successfully.#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.028 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.048 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.054 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.058 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.059 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.059 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.059 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.060 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.060 186853 DEBUG nova.virt.libvirt.driver [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.088 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.152 186853 INFO nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Took 8.74 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.155 186853 DEBUG nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:50:38 np0005531887 podman[219583]: 2025-11-22 07:50:38.220233464 +0000 UTC m=+0.409590181 container init 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:50:38 np0005531887 podman[219583]: 2025-11-22 07:50:38.227273739 +0000 UTC m=+0.416630456 container start 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.236 186853 INFO nova.compute.manager [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Took 9.56 seconds to build instance.#033[00m
Nov 22 02:50:38 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [NOTICE]   (219603) : New worker (219605) forked
Nov 22 02:50:38 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [NOTICE]   (219603) : Loading success.
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.262 186853 DEBUG oslo_concurrency.lockutils [None req-f5ca46ce-0496-425e-bbdf-09ca51deefe7 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.752 186853 DEBUG nova.network.neutron [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updated VIF entry in instance network info cache for port f438eb18-11d6-49aa-9cc3-1a2656d47a6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.753 186853 DEBUG nova.network.neutron [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updating instance_info_cache with network_info: [{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:38 np0005531887 nova_compute[186849]: 2025-11-22 07:50:38.764 186853 DEBUG oslo_concurrency.lockutils [req-05007458-17fc-4521-84f5-0339f142756f req-92c1e29d-97c2-4170-8bd7-e5ce49098db1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:39 np0005531887 nova_compute[186849]: 2025-11-22 07:50:39.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.112 186853 DEBUG nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.112 186853 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.113 186853 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.113 186853 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.113 186853 DEBUG nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] No waiting events found dispatching network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.114 186853 WARNING nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received unexpected event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e for instance with vm_state active and task_state None.#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:40 np0005531887 nova_compute[186849]: 2025-11-22 07:50:40.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:40 np0005531887 podman[219615]: 2025-11-22 07:50:40.840364681 +0000 UTC m=+0.057946832 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:50:41 np0005531887 NetworkManager[55210]: <info>  [1763797841.7175] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 22 02:50:41 np0005531887 nova_compute[186849]: 2025-11-22 07:50:41.718 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:41 np0005531887 NetworkManager[55210]: <info>  [1763797841.7188] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 22 02:50:41 np0005531887 nova_compute[186849]: 2025-11-22 07:50:41.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:41 np0005531887 nova_compute[186849]: 2025-11-22 07:50:41.817 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:41Z|00105|binding|INFO|Releasing lport bb242c10-f687-4325-8b1f-4e7f4a04ebdb from this chassis (sb_readonly=0)
Nov 22 02:50:41 np0005531887 nova_compute[186849]: 2025-11-22 07:50:41.835 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.192 186853 DEBUG nova.compute.manager [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-changed-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.193 186853 DEBUG nova.compute.manager [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Refreshing instance network info cache due to event network-changed-f438eb18-11d6-49aa-9cc3-1a2656d47a6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.193 186853 DEBUG oslo_concurrency.lockutils [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.193 186853 DEBUG oslo_concurrency.lockutils [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.194 186853 DEBUG nova.network.neutron [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Refreshing network info cache for port f438eb18-11d6-49aa-9cc3-1a2656d47a6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:50:42 np0005531887 nova_compute[186849]: 2025-11-22 07:50:42.998 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:50:43 np0005531887 podman[219635]: 2025-11-22 07:50:43.850452229 +0000 UTC m=+0.067324415 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.048 186853 DEBUG nova.network.neutron [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updated VIF entry in instance network info cache for port f438eb18-11d6-49aa-9cc3-1a2656d47a6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.048 186853 DEBUG nova.network.neutron [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updating instance_info_cache with network_info: [{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.077 186853 DEBUG oslo_concurrency.lockutils [req-da6cde3c-0c14-471d-99d0-90d683923b99 req-1dbe9c3c-a27c-4e87-a4e2-180ae0ff3c87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.078 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.079 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:50:44 np0005531887 nova_compute[186849]: 2025-11-22 07:50:44.079 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c290ea3f-b425-4f25-946d-6b45d2ec31e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:50:45 np0005531887 nova_compute[186849]: 2025-11-22 07:50:45.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:45 np0005531887 nova_compute[186849]: 2025-11-22 07:50:45.763 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.706 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updating instance_info_cache with network_info: [{"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.719 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-c290ea3f-b425-4f25-946d-6b45d2ec31e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.719 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.719 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.720 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.720 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.720 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.721 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.721 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.740 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.740 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.740 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.741 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.819 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.886 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.887 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:50:46 np0005531887 nova_compute[186849]: 2025-11-22 07:50:46.955 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.128 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.129 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5609MB free_disk=73.45747756958008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.129 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.130 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.196 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c290ea3f-b425-4f25-946d-6b45d2ec31e5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.197 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.197 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.235 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.248 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.272 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:50:47 np0005531887 nova_compute[186849]: 2025-11-22 07:50:47.273 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:50:48 np0005531887 podman[219662]: 2025-11-22 07:50:48.828803871 +0000 UTC m=+0.050321372 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:50:50 np0005531887 nova_compute[186849]: 2025-11-22 07:50:50.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:50 np0005531887 nova_compute[186849]: 2025-11-22 07:50:50.764 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:53 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:53Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:a6:64 10.100.0.7
Nov 22 02:50:53 np0005531887 ovn_controller[95130]: 2025-11-22T07:50:53Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:a6:64 10.100.0.7
Nov 22 02:50:54 np0005531887 podman[219710]: 2025-11-22 07:50:54.841289909 +0000 UTC m=+0.061974651 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Nov 22 02:50:55 np0005531887 nova_compute[186849]: 2025-11-22 07:50:55.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:55 np0005531887 nova_compute[186849]: 2025-11-22 07:50:55.768 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:57.038 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:50:57 np0005531887 nova_compute[186849]: 2025-11-22 07:50:57.038 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:50:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:57.041 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:50:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:57.042 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:58.750 104195 DEBUG eventlet.wsgi.server [-] (104195) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:50:58.753 104195 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: Accept: */*#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: Connection: close#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: Content-Type: text/plain#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: Host: 169.254.169.254#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: User-Agent: curl/7.84.0#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: X-Forwarded-For: 10.100.0.7#015
Nov 22 02:50:58 np0005531887 ovn_metadata_agent[104079]: X-Ovn-Network-Id: 9a29fb10-157b-4bc3-b002-01a9b93f1b72 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.051 104195 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.052 104195 INFO eventlet.wsgi.server [-] 10.100.0.7,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.2997622#033[00m
Nov 22 02:51:00 np0005531887 haproxy-metadata-proxy-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219605]: 10.100.0.7:49850 [22/Nov/2025:07:50:58.747] listener listener/metadata 0/0/0/1305/1305 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.225 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.225 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.226 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.226 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.226 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.233 186853 INFO nova.compute.manager [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Terminating instance#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.239 186853 DEBUG nova.compute.manager [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:51:00 np0005531887 kernel: tapf438eb18-11 (unregistering): left promiscuous mode
Nov 22 02:51:00 np0005531887 NetworkManager[55210]: <info>  [1763797860.2618] device (tapf438eb18-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 ovn_controller[95130]: 2025-11-22T07:51:00Z|00106|binding|INFO|Releasing lport f438eb18-11d6-49aa-9cc3-1a2656d47a6e from this chassis (sb_readonly=0)
Nov 22 02:51:00 np0005531887 ovn_controller[95130]: 2025-11-22T07:51:00Z|00107|binding|INFO|Setting lport f438eb18-11d6-49aa-9cc3-1a2656d47a6e down in Southbound
Nov 22 02:51:00 np0005531887 ovn_controller[95130]: 2025-11-22T07:51:00Z|00108|binding|INFO|Removing iface tapf438eb18-11 ovn-installed in OVS
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.276 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.283 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:a6:64 10.100.0.7'], port_security=['fa:16:3e:a4:a6:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c290ea3f-b425-4f25-946d-6b45d2ec31e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64566cf00036456abfe577ae2fef6a7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f752c9b6-5e6c-4195-af4e-8cb2b6c614cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d44a3a02-a156-4d14-ba6f-788e886825aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f438eb18-11d6-49aa-9cc3-1a2656d47a6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.284 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f438eb18-11d6-49aa-9cc3-1a2656d47a6e in datapath 9a29fb10-157b-4bc3-b002-01a9b93f1b72 unbound from our chassis#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.286 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a29fb10-157b-4bc3-b002-01a9b93f1b72, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.288 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7720ca48-cd5d-4350-9bc4-8fe89dc99fee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.288 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72 namespace which is not needed anymore#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.297 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000031.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531887 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000031.scope: Consumed 14.988s CPU time.
Nov 22 02:51:00 np0005531887 systemd-machined[153180]: Machine qemu-18-instance-00000031 terminated.
Nov 22 02:51:00 np0005531887 podman[219731]: 2025-11-22 07:51:00.376075441 +0000 UTC m=+0.077269701 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:51:00 np0005531887 podman[219734]: 2025-11-22 07:51:00.388288765 +0000 UTC m=+0.093350732 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:51:00 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [NOTICE]   (219603) : haproxy version is 2.8.14-c23fe91
Nov 22 02:51:00 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [NOTICE]   (219603) : path to executable is /usr/sbin/haproxy
Nov 22 02:51:00 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [WARNING]  (219603) : Exiting Master process...
Nov 22 02:51:00 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [ALERT]    (219603) : Current worker (219605) exited with code 143 (Terminated)
Nov 22 02:51:00 np0005531887 neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72[219599]: [WARNING]  (219603) : All workers exited. Exiting... (0)
Nov 22 02:51:00 np0005531887 systemd[1]: libpod-9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531887 podman[219796]: 2025-11-22 07:51:00.433609941 +0000 UTC m=+0.052533536 container died 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:51:00 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a-userdata-shm.mount: Deactivated successfully.
Nov 22 02:51:00 np0005531887 systemd[1]: var-lib-containers-storage-overlay-e82232b269efa26f45c44968c6315caefcba38747368d4f00ea8ac4583970db0-merged.mount: Deactivated successfully.
Nov 22 02:51:00 np0005531887 podman[219796]: 2025-11-22 07:51:00.479607085 +0000 UTC m=+0.098530680 container cleanup 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:51:00 np0005531887 systemd[1]: libpod-conmon-9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a.scope: Deactivated successfully.
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.507 186853 INFO nova.virt.libvirt.driver [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Instance destroyed successfully.#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.508 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.509 186853 DEBUG nova.objects.instance [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lazy-loading 'resources' on Instance uuid c290ea3f-b425-4f25-946d-6b45d2ec31e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.531 186853 DEBUG nova.virt.libvirt.vif [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=49,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCq3tOHLxFADk08BTE74la/pkcSbzch0W8vcagtv1h5VaBxLk97y+NzQbQG03sgRQKdqFzRFfi3kZFVvixXTGpd83WduPNQtWl9jKX5/A3kaBuFKJlYMCgUPssX9D8icWg==',key_name='tempest-keypair-922179052',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='64566cf00036456abfe577ae2fef6a7c',ramdisk_id='',reservation_id='r-vxefzfau',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-925199310',owner_user_name='tempest-ServersV294TestFqdnHostnames-925199310-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='617dbb2ad35c42bc834156437fc93e34',uuid=c290ea3f-b425-4f25-946d-6b45d2ec31e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.532 186853 DEBUG nova.network.os_vif_util [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converting VIF {"id": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "address": "fa:16:3e:a4:a6:64", "network": {"id": "9a29fb10-157b-4bc3-b002-01a9b93f1b72", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1756068905-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64566cf00036456abfe577ae2fef6a7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf438eb18-11", "ovs_interfaceid": "f438eb18-11d6-49aa-9cc3-1a2656d47a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.532 186853 DEBUG nova.network.os_vif_util [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.533 186853 DEBUG os_vif [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.535 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf438eb18-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.537 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.541 186853 INFO os_vif [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:a6:64,bridge_name='br-int',has_traffic_filtering=True,id=f438eb18-11d6-49aa-9cc3-1a2656d47a6e,network=Network(9a29fb10-157b-4bc3-b002-01a9b93f1b72),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf438eb18-11')#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.541 186853 INFO nova.virt.libvirt.driver [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Deleting instance files /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5_del#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.542 186853 INFO nova.virt.libvirt.driver [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Deletion of /var/lib/nova/instances/c290ea3f-b425-4f25-946d-6b45d2ec31e5_del complete#033[00m
Nov 22 02:51:00 np0005531887 podman[219842]: 2025-11-22 07:51:00.564137856 +0000 UTC m=+0.057661825 container remove 9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.573 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[661764ee-6e2f-4537-adca-9728ca9ce6e3]: (4, ('Sat Nov 22 07:51:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72 (9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a)\n9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a\nSat Nov 22 07:51:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72 (9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a)\n9c01ef29234da655ca1bd3dc8faedf6d6dcdbf85ee8e0cfcdd4526b56308e14a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.575 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa3ffc6-cbd5-4095-99d5-0cd34d667b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.576 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a29fb10-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:51:00 np0005531887 kernel: tap9a29fb10-10: left promiscuous mode
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.579 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.590 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.593 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b98a39e7-6d4c-4348-bc4e-7d70b183e784]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.608 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[79b02c1e-a94f-4b1e-8a59-a1df8dcdd419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.610 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[973e2e64-0557-4823-9736-27b9bd4c12b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.624 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9513adaf-112f-47a1-b967-2546aef12127]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457978, 'reachable_time': 16041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219859, 'error': None, 'target': 'ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.627 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9a29fb10-157b-4bc3-b002-01a9b93f1b72 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:51:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:00.627 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[133d6f4c-bcc9-4ac6-a7da-0ecb8990b22d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:51:00 np0005531887 systemd[1]: run-netns-ovnmeta\x2d9a29fb10\x2d157b\x2d4bc3\x2db002\x2d01a9b93f1b72.mount: Deactivated successfully.
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.634 186853 INFO nova.compute.manager [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.634 186853 DEBUG oslo.service.loopingcall [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.634 186853 DEBUG nova.compute.manager [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:51:00 np0005531887 nova_compute[186849]: 2025-11-22 07:51:00.635 186853 DEBUG nova.network.neutron [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.302 186853 DEBUG nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-unplugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.302 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.302 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] No waiting events found dispatching network-vif-unplugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-unplugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.303 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.304 186853 DEBUG oslo_concurrency.lockutils [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.304 186853 DEBUG nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] No waiting events found dispatching network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:51:02 np0005531887 nova_compute[186849]: 2025-11-22 07:51:02.304 186853 WARNING nova.compute.manager [req-b613ec0d-7ab2-417d-a6b9-a62944c3ee3e req-6d88ea4a-b06f-4f63-b841-72eda2d8e521 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received unexpected event network-vif-plugged-f438eb18-11d6-49aa-9cc3-1a2656d47a6e for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.023 186853 DEBUG nova.network.neutron [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.045 186853 INFO nova.compute.manager [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Took 2.41 seconds to deallocate network for instance.#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.146 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.146 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.215 186853 DEBUG nova.compute.provider_tree [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.231 186853 DEBUG nova.scheduler.client.report [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.289 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.355 186853 INFO nova.scheduler.client.report [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Deleted allocations for instance c290ea3f-b425-4f25-946d-6b45d2ec31e5#033[00m
Nov 22 02:51:03 np0005531887 nova_compute[186849]: 2025-11-22 07:51:03.505 186853 DEBUG oslo_concurrency.lockutils [None req-27322212-c226-4a12-944f-a88cb971c6c9 617dbb2ad35c42bc834156437fc93e34 64566cf00036456abfe577ae2fef6a7c - - default default] Lock "c290ea3f-b425-4f25-946d-6b45d2ec31e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:04 np0005531887 nova_compute[186849]: 2025-11-22 07:51:04.531 186853 DEBUG nova.compute.manager [req-617289c2-cec5-4e98-9995-0d9691663e9b req-08c55ef7-46b2-42d0-9ac5-2d60be80c554 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Received event network-vif-deleted-f438eb18-11d6-49aa-9cc3-1a2656d47a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:51:04 np0005531887 nova_compute[186849]: 2025-11-22 07:51:04.906 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:05 np0005531887 nova_compute[186849]: 2025-11-22 07:51:05.073 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:05 np0005531887 nova_compute[186849]: 2025-11-22 07:51:05.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:05 np0005531887 nova_compute[186849]: 2025-11-22 07:51:05.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:06 np0005531887 podman[219861]: 2025-11-22 07:51:06.855222775 +0000 UTC m=+0.065870392 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:51:10 np0005531887 nova_compute[186849]: 2025-11-22 07:51:10.512 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:10 np0005531887 nova_compute[186849]: 2025-11-22 07:51:10.540 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:11 np0005531887 podman[219886]: 2025-11-22 07:51:11.83633444 +0000 UTC m=+0.054248496 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:51:14 np0005531887 podman[219904]: 2025-11-22 07:51:14.829458435 +0000 UTC m=+0.054687929 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 02:51:15 np0005531887 nova_compute[186849]: 2025-11-22 07:51:15.506 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797860.5054815, c290ea3f-b425-4f25-946d-6b45d2ec31e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:15 np0005531887 nova_compute[186849]: 2025-11-22 07:51:15.507 186853 INFO nova.compute.manager [-] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:51:15 np0005531887 nova_compute[186849]: 2025-11-22 07:51:15.513 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:15 np0005531887 nova_compute[186849]: 2025-11-22 07:51:15.541 186853 DEBUG nova.compute.manager [None req-4ff1cd3d-58f7-4376-be2a-c493b01ed1e0 - - - - - -] [instance: c290ea3f-b425-4f25-946d-6b45d2ec31e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:15 np0005531887 nova_compute[186849]: 2025-11-22 07:51:15.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:19 np0005531887 podman[219925]: 2025-11-22 07:51:19.833532802 +0000 UTC m=+0.053723455 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:51:20 np0005531887 nova_compute[186849]: 2025-11-22 07:51:20.515 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:20 np0005531887 nova_compute[186849]: 2025-11-22 07:51:20.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:25 np0005531887 nova_compute[186849]: 2025-11-22 07:51:25.518 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:25 np0005531887 nova_compute[186849]: 2025-11-22 07:51:25.545 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:25 np0005531887 podman[219949]: 2025-11-22 07:51:25.842926287 +0000 UTC m=+0.067295107 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc.)
Nov 22 02:51:30 np0005531887 nova_compute[186849]: 2025-11-22 07:51:30.520 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:30 np0005531887 nova_compute[186849]: 2025-11-22 07:51:30.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:30 np0005531887 podman[219971]: 2025-11-22 07:51:30.851147196 +0000 UTC m=+0.060234245 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 22 02:51:30 np0005531887 podman[219972]: 2025-11-22 07:51:30.868945962 +0000 UTC m=+0.074012342 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:51:35 np0005531887 nova_compute[186849]: 2025-11-22 07:51:35.524 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:35 np0005531887 nova_compute[186849]: 2025-11-22 07:51:35.548 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:37.322 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:37.323 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:51:37.324 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:37 np0005531887 podman[220015]: 2025-11-22 07:51:37.837296272 +0000 UTC m=+0.053923350 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:51:40 np0005531887 nova_compute[186849]: 2025-11-22 07:51:40.322 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:40 np0005531887 nova_compute[186849]: 2025-11-22 07:51:40.525 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:40 np0005531887 nova_compute[186849]: 2025-11-22 07:51:40.550 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:41 np0005531887 nova_compute[186849]: 2025-11-22 07:51:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:42 np0005531887 podman[220042]: 2025-11-22 07:51:42.836048038 +0000 UTC m=+0.053831498 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:51:43 np0005531887 nova_compute[186849]: 2025-11-22 07:51:43.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:43 np0005531887 nova_compute[186849]: 2025-11-22 07:51:43.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.837 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.838 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.838 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.838 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.838 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.862 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.863 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.863 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:44 np0005531887 nova_compute[186849]: 2025-11-22 07:51:44.863 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:51:45 np0005531887 podman[220063]: 2025-11-22 07:51:45.088628534 +0000 UTC m=+0.079025135 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.162 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.163 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.45829010009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.163 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.163 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.237 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.237 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.274 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.287 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.312 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.312 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.527 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:45 np0005531887 nova_compute[186849]: 2025-11-22 07:51:45.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:46 np0005531887 nova_compute[186849]: 2025-11-22 07:51:46.244 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:46 np0005531887 nova_compute[186849]: 2025-11-22 07:51:46.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:51:46 np0005531887 ovn_controller[95130]: 2025-11-22T07:51:46Z|00109|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 02:51:50 np0005531887 nova_compute[186849]: 2025-11-22 07:51:50.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:50 np0005531887 nova_compute[186849]: 2025-11-22 07:51:50.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:50 np0005531887 podman[220083]: 2025-11-22 07:51:50.819529116 +0000 UTC m=+0.044836778 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.067 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.067 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.111 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.232 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.233 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.244 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.244 186853 INFO nova.compute.claims [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.356 186853 DEBUG nova.compute.provider_tree [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.418 186853 DEBUG nova.scheduler.client.report [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.447 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.448 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.519 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.519 186853 DEBUG nova.network.neutron [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.539 186853 INFO nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.567 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.669 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.670 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.671 186853 INFO nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Creating image(s)#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.672 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.672 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.673 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.691 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.756 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.758 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.759 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.770 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.833 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.834 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.885 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.887 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.888 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.950 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.951 186853 DEBUG nova.virt.disk.api [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Checking if we can resize image /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.952 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.997 186853 DEBUG nova.network.neutron [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 22 02:51:51 np0005531887 nova_compute[186849]: 2025-11-22 07:51:51.998 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.021 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.022 186853 DEBUG nova.virt.disk.api [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Cannot resize image /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.022 186853 DEBUG nova.objects.instance [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b1e0374-b871-406a-98ec-e74a4f1822a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.038 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.039 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Ensure instance console log exists: /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.039 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.040 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.040 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.042 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.050 186853 WARNING nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.056 186853 DEBUG nova.virt.libvirt.host [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.057 186853 DEBUG nova.virt.libvirt.host [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.061 186853 DEBUG nova.virt.libvirt.host [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.062 186853 DEBUG nova.virt.libvirt.host [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.064 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.064 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.065 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.065 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.065 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.066 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.066 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.066 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.066 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.067 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.067 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.067 186853 DEBUG nova.virt.hardware [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.072 186853 DEBUG nova.objects.instance [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b1e0374-b871-406a-98ec-e74a4f1822a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.087 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <uuid>0b1e0374-b871-406a-98ec-e74a4f1822a3</uuid>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <name>instance-00000036</name>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:name>tempest-TenantUsagesTestJSON-server-1762263964</nova:name>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:51:52</nova:creationTime>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:user uuid="01c82b701aee48a3aa55d5491c638a64">tempest-TenantUsagesTestJSON-898548957-project-member</nova:user>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:        <nova:project uuid="866eac76303344da94aa51721faf7527">tempest-TenantUsagesTestJSON-898548957</nova:project>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <nova:ports/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="serial">0b1e0374-b871-406a-98ec-e74a4f1822a3</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="uuid">0b1e0374-b871-406a-98ec-e74a4f1822a3</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.config"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/console.log" append="off"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:51:52 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:51:52 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:51:52 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:51:52 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.133 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.134 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.134 186853 INFO nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Using config drive#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.802 186853 INFO nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Creating config drive at /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.config#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.806 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy8af3sw7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:51:52 np0005531887 nova_compute[186849]: 2025-11-22 07:51:52.938 186853 DEBUG oslo_concurrency.processutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy8af3sw7" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:51:53 np0005531887 systemd-machined[153180]: New machine qemu-19-instance-00000036.
Nov 22 02:51:53 np0005531887 systemd[1]: Started Virtual Machine qemu-19-instance-00000036.
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.367 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797913.36721, 0b1e0374-b871-406a-98ec-e74a4f1822a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.368 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.370 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.372 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.375 186853 INFO nova.virt.libvirt.driver [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance spawned successfully.#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.376 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.400 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.406 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.408 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.409 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.409 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.409 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.410 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.410 186853 DEBUG nova.virt.libvirt.driver [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.433 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.434 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797913.3711903, 0b1e0374-b871-406a-98ec-e74a4f1822a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.434 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] VM Started (Lifecycle Event)#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.452 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.455 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.483 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.526 186853 INFO nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Took 1.86 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.527 186853 DEBUG nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.622 186853 INFO nova.compute.manager [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Took 2.42 seconds to build instance.#033[00m
Nov 22 02:51:53 np0005531887 nova_compute[186849]: 2025-11-22 07:51:53.649 186853 DEBUG oslo_concurrency.lockutils [None req-f446241d-5242-408e-8ca9-2d41c1268ee5 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:55 np0005531887 nova_compute[186849]: 2025-11-22 07:51:55.531 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:55 np0005531887 nova_compute[186849]: 2025-11-22 07:51:55.556 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:51:56 np0005531887 podman[220149]: 2025-11-22 07:51:56.865112596 +0000 UTC m=+0.076959803 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.824 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.824 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.824 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "0b1e0374-b871-406a-98ec-e74a4f1822a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.825 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.825 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.832 186853 INFO nova.compute.manager [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Terminating instance#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.838 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "refresh_cache-0b1e0374-b871-406a-98ec-e74a4f1822a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.838 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquired lock "refresh_cache-0b1e0374-b871-406a-98ec-e74a4f1822a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:51:57 np0005531887 nova_compute[186849]: 2025-11-22 07:51:57.838 186853 DEBUG nova.network.neutron [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.108 186853 DEBUG nova.network.neutron [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.698 186853 DEBUG nova.network.neutron [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.716 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Releasing lock "refresh_cache-0b1e0374-b871-406a-98ec-e74a4f1822a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.717 186853 DEBUG nova.compute.manager [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:51:58 np0005531887 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000036.scope: Deactivated successfully.
Nov 22 02:51:58 np0005531887 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000036.scope: Consumed 5.611s CPU time.
Nov 22 02:51:58 np0005531887 systemd-machined[153180]: Machine qemu-19-instance-00000036 terminated.
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.960 186853 INFO nova.virt.libvirt.driver [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance destroyed successfully.
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.961 186853 DEBUG nova.objects.instance [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lazy-loading 'resources' on Instance uuid 0b1e0374-b871-406a-98ec-e74a4f1822a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.976 186853 INFO nova.virt.libvirt.driver [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Deleting instance files /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3_del
Nov 22 02:51:58 np0005531887 nova_compute[186849]: 2025-11-22 07:51:58.977 186853 INFO nova.virt.libvirt.driver [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Deletion of /var/lib/nova/instances/0b1e0374-b871-406a-98ec-e74a4f1822a3_del complete
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.104 186853 INFO nova.compute.manager [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.104 186853 DEBUG oslo.service.loopingcall [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.105 186853 DEBUG nova.compute.manager [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.105 186853 DEBUG nova.network.neutron [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.828 186853 DEBUG nova.network.neutron [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.847 186853 DEBUG nova.network.neutron [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:51:59 np0005531887 nova_compute[186849]: 2025-11-22 07:51:59.874 186853 INFO nova.compute.manager [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Took 0.77 seconds to deallocate network for instance.
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.075 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.076 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.162 186853 DEBUG nova.compute.provider_tree [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.185 186853 DEBUG nova.scheduler.client.report [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.215 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.532 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.608 186853 INFO nova.scheduler.client.report [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Deleted allocations for instance 0b1e0374-b871-406a-98ec-e74a4f1822a3
Nov 22 02:52:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:00.675 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:52:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:00.675 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 02:52:00 np0005531887 nova_compute[186849]: 2025-11-22 07:52:00.743 186853 DEBUG oslo_concurrency.lockutils [None req-61bdc873-712e-45bf-8e43-d988061d1ce8 01c82b701aee48a3aa55d5491c638a64 866eac76303344da94aa51721faf7527 - - default default] Lock "0b1e0374-b871-406a-98ec-e74a4f1822a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:52:01 np0005531887 podman[220180]: 2025-11-22 07:52:01.83183687 +0000 UTC m=+0.054211248 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 02:52:01 np0005531887 podman[220181]: 2025-11-22 07:52:01.887290116 +0000 UTC m=+0.107543801 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 02:52:05 np0005531887 nova_compute[186849]: 2025-11-22 07:52:05.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:52:05 np0005531887 nova_compute[186849]: 2025-11-22 07:52:05.558 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:52:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:07.677 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.148 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.148 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.171 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.393 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.393 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.400 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.401 186853 INFO nova.compute.claims [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Claim successful on node compute-1.ctlplane.example.com
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.545 186853 DEBUG nova.compute.provider_tree [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.562 186853 DEBUG nova.scheduler.client.report [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.707 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.708 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.790 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.790 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.816 186853 INFO nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:52:08 np0005531887 podman[220229]: 2025-11-22 07:52:08.850276054 +0000 UTC m=+0.070453235 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:52:08 np0005531887 nova_compute[186849]: 2025-11-22 07:52:08.870 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.265 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.268 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.268 186853 INFO nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Creating image(s)
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.269 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.269 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.270 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.283 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.341 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.343 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.343 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.359 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.413 186853 DEBUG nova.policy [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.423 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.424 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.467 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.468 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.469 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.563 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.564 186853 DEBUG nova.virt.disk.api [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.564 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.622 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.623 186853 DEBUG nova.virt.disk.api [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.623 186853 DEBUG nova.objects.instance [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.644 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.644 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Ensure instance console log exists: /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.645 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.645 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:09 np0005531887 nova_compute[186849]: 2025-11-22 07:52:09.646 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:10 np0005531887 nova_compute[186849]: 2025-11-22 07:52:10.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:10 np0005531887 nova_compute[186849]: 2025-11-22 07:52:10.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:11 np0005531887 nova_compute[186849]: 2025-11-22 07:52:11.339 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Successfully created port: 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.608 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Successfully updated port: 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.632 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.633 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.633 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:52:13 np0005531887 podman[220268]: 2025-11-22 07:52:13.829436393 +0000 UTC m=+0.048915898 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.960 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797918.9588935, 0b1e0374-b871-406a-98ec-e74a4f1822a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.960 186853 INFO nova.compute.manager [-] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:52:13 np0005531887 nova_compute[186849]: 2025-11-22 07:52:13.979 186853 DEBUG nova.compute.manager [None req-774df073-9621-4511-b410-9f61f4d9ba38 - - - - - -] [instance: 0b1e0374-b871-406a-98ec-e74a4f1822a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:14 np0005531887 nova_compute[186849]: 2025-11-22 07:52:14.217 186853 DEBUG nova.compute.manager [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received event network-changed-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:52:14 np0005531887 nova_compute[186849]: 2025-11-22 07:52:14.217 186853 DEBUG nova.compute.manager [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Refreshing instance network info cache due to event network-changed-8eb86bde-6b5b-491d-9ae4-94dcc46b6169. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:52:14 np0005531887 nova_compute[186849]: 2025-11-22 07:52:14.218 186853 DEBUG oslo_concurrency.lockutils [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:52:14 np0005531887 nova_compute[186849]: 2025-11-22 07:52:14.863 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:52:15 np0005531887 nova_compute[186849]: 2025-11-22 07:52:15.539 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:15 np0005531887 nova_compute[186849]: 2025-11-22 07:52:15.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:15 np0005531887 podman[220287]: 2025-11-22 07:52:15.83443549 +0000 UTC m=+0.058207795 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:52:17 np0005531887 nova_compute[186849]: 2025-11-22 07:52:17.870 186853 DEBUG nova.network.neutron [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Updating instance_info_cache with network_info: [{"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.309 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.310 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance network_info: |[{"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.311 186853 DEBUG oslo_concurrency.lockutils [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.311 186853 DEBUG nova.network.neutron [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Refreshing network info cache for port 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.315 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Start _get_guest_xml network_info=[{"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.320 186853 WARNING nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.328 186853 DEBUG nova.virt.libvirt.host [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.329 186853 DEBUG nova.virt.libvirt.host [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.349 186853 DEBUG nova.virt.libvirt.host [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.349 186853 DEBUG nova.virt.libvirt.host [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.351 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.351 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.352 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.352 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.352 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.352 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.352 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.353 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.353 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.353 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.353 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.354 186853 DEBUG nova.virt.hardware [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.359 186853 DEBUG nova.virt.libvirt.vif [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1914028081',display_name='tempest-DeleteServersTestJSON-server-1914028081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1914028081',id=55,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-x6deem4b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSO
N-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:08Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=f4f28646-e9a4-464e-a7f6-db4a7d0d83a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.360 186853 DEBUG nova.network.os_vif_util [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.361 186853 DEBUG nova.network.os_vif_util [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.362 186853 DEBUG nova.objects.instance [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.381 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <uuid>f4f28646-e9a4-464e-a7f6-db4a7d0d83a9</uuid>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <name>instance-00000037</name>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:name>tempest-DeleteServersTestJSON-server-1914028081</nova:name>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:52:18</nova:creationTime>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        <nova:port uuid="8eb86bde-6b5b-491d-9ae4-94dcc46b6169">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="serial">f4f28646-e9a4-464e-a7f6-db4a7d0d83a9</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="uuid">f4f28646-e9a4-464e-a7f6-db4a7d0d83a9</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.config"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:3d:f9:da"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <target dev="tap8eb86bde-6b"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/console.log" append="off"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:52:18 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:52:18 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:52:18 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:52:18 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.383 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Preparing to wait for external event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.383 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.384 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.384 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.385 186853 DEBUG nova.virt.libvirt.vif [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1914028081',display_name='tempest-DeleteServersTestJSON-server-1914028081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1914028081',id=55,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-x6deem4b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:08Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=f4f28646-e9a4-464e-a7f6-db4a7d0d83a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.385 186853 DEBUG nova.network.os_vif_util [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.386 186853 DEBUG nova.network.os_vif_util [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.386 186853 DEBUG os_vif [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.387 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.387 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.388 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.391 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8eb86bde-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.391 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8eb86bde-6b, col_values=(('external_ids', {'iface-id': '8eb86bde-6b5b-491d-9ae4-94dcc46b6169', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:f9:da', 'vm-uuid': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:18 np0005531887 NetworkManager[55210]: <info>  [1763797938.3942] manager: (tap8eb86bde-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.393 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.400 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.401 186853 INFO os_vif [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b')#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.503 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.504 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.504 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:3d:f9:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:52:18 np0005531887 nova_compute[186849]: 2025-11-22 07:52:18.504 186853 INFO nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Using config drive#033[00m
Nov 22 02:52:19 np0005531887 nova_compute[186849]: 2025-11-22 07:52:19.930 186853 INFO nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Creating config drive at /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.config#033[00m
Nov 22 02:52:19 np0005531887 nova_compute[186849]: 2025-11-22 07:52:19.935 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5j9tdgty execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.058 186853 DEBUG oslo_concurrency.processutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5j9tdgty" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:20 np0005531887 kernel: tap8eb86bde-6b: entered promiscuous mode
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.1140] manager: (tap8eb86bde-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.115 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:20Z|00110|binding|INFO|Claiming lport 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 for this chassis.
Nov 22 02:52:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:20Z|00111|binding|INFO|8eb86bde-6b5b-491d-9ae4-94dcc46b6169: Claiming fa:16:3e:3d:f9:da 10.100.0.6
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.118 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 systemd-udevd[220325]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.1513] device (tap8eb86bde-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.1526] device (tap8eb86bde-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.172 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 systemd-machined[153180]: New machine qemu-20-instance-00000037.
Nov 22 02:52:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:20Z|00112|binding|INFO|Setting lport 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 ovn-installed in OVS
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.178 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 systemd[1]: Started Virtual Machine qemu-20-instance-00000037.
Nov 22 02:52:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:20Z|00113|binding|INFO|Setting lport 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 up in Southbound
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.240 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:f9:da 10.100.0.6'], port_security=['fa:16:3e:3d:f9:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=8eb86bde-6b5b-491d-9ae4-94dcc46b6169) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.241 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.242 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.253 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[28e7f029-3b4a-47d5-85e0-c62ddea0b78a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.255 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.259 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.259 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[842f7a89-32fd-44e1-82c2-a878f8163018]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.261 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c818c99d-fa99-4f4d-8834-f29461077935]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.283 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[42a36458-7af0-4c69-9012-fe2ea6d2a61b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.298 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[db44633f-68a8-4184-9015-ed1862a50193]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.337 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef43c7e-a6c6-4948-8871-b2b9e90a3fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.346 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6cd28c-b795-478f-adeb-d18c3e8ea382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.3480] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.382 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8412fc1e-ddef-44eb-8449-f8ace9ae2322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.386 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b8145af5-81c8-4e11-a36b-b5929a2f89ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.4072] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.412 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[39e0cb2e-88ea-44d8-b257-d6f066b5c716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.428 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fa58fdad-0c83-458b-9429-671c9815e0d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468301, 'reachable_time': 36749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220365, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.442 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3db78bcb-862d-400c-b52b-186cbf24e4ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468301, 'tstamp': 468301}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220366, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.458 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2ccbe9-57d9-4068-bcce-782d80812e7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468301, 'reachable_time': 36749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220367, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.488 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1514b4-2207-491a-abdc-42b1399ae674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.541 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.547 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[89f6b2bd-730d-44e3-b1cf-8db0dcabcf5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.549 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.549 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.549 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:20 np0005531887 NetworkManager[55210]: <info>  [1763797940.5517] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.554 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.556 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:20Z|00114|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.558 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.559 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2a00d07c-fca0-4e18-82ea-d25be447c49b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.560 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:52:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:20.561 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.569 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.869 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797940.8690689, f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.870 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] VM Started (Lifecycle Event)#033[00m
Nov 22 02:52:20 np0005531887 podman[220404]: 2025-11-22 07:52:20.931173895 +0000 UTC m=+0.052488376 container create bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:52:20 np0005531887 systemd[1]: Started libpod-conmon-bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a.scope.
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.966 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.972 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797940.8733435, f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:52:20 np0005531887 nova_compute[186849]: 2025-11-22 07:52:20.973 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:52:20 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:52:20 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c798bf80d9165c6eb18676c155599559a84ef3a1272473aef330213583ebe1c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:52:20 np0005531887 podman[220404]: 2025-11-22 07:52:20.903406985 +0000 UTC m=+0.024721486 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:52:21 np0005531887 podman[220404]: 2025-11-22 07:52:21.014315519 +0000 UTC m=+0.135630000 container init bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 02:52:21 np0005531887 podman[220404]: 2025-11-22 07:52:21.01966808 +0000 UTC m=+0.140982551 container start bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:52:21 np0005531887 podman[220417]: 2025-11-22 07:52:21.038819209 +0000 UTC m=+0.061198819 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:52:21 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [NOTICE]   (220440) : New worker (220447) forked
Nov 22 02:52:21 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [NOTICE]   (220440) : Loading success.
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.074 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.078 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.139 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.847 186853 DEBUG nova.compute.manager [req-8503a4d4-5377-4c09-b28f-8d17eb79c611 req-664157a8-284e-48cc-b590-c966a4b0f3f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.847 186853 DEBUG oslo_concurrency.lockutils [req-8503a4d4-5377-4c09-b28f-8d17eb79c611 req-664157a8-284e-48cc-b590-c966a4b0f3f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.847 186853 DEBUG oslo_concurrency.lockutils [req-8503a4d4-5377-4c09-b28f-8d17eb79c611 req-664157a8-284e-48cc-b590-c966a4b0f3f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.848 186853 DEBUG oslo_concurrency.lockutils [req-8503a4d4-5377-4c09-b28f-8d17eb79c611 req-664157a8-284e-48cc-b590-c966a4b0f3f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.848 186853 DEBUG nova.compute.manager [req-8503a4d4-5377-4c09-b28f-8d17eb79c611 req-664157a8-284e-48cc-b590-c966a4b0f3f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Processing event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.848 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.851 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763797941.8517904, f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.852 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.853 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.856 186853 INFO nova.virt.libvirt.driver [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance spawned successfully.#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.857 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.907 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.915 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.921 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.921 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.922 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.922 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.923 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.923 186853 DEBUG nova.virt.libvirt.driver [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:52:21 np0005531887 nova_compute[186849]: 2025-11-22 07:52:21.944 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.136 186853 INFO nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Took 12.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.137 186853 DEBUG nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.306 186853 INFO nova.compute.manager [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Took 13.97 seconds to build instance.#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.351 186853 DEBUG oslo_concurrency.lockutils [None req-eb5a186a-9eb2-4ce8-88ff-fcf5688bc1cd 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.935 186853 DEBUG nova.network.neutron [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Updated VIF entry in instance network info cache for port 8eb86bde-6b5b-491d-9ae4-94dcc46b6169. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.936 186853 DEBUG nova.network.neutron [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Updating instance_info_cache with network_info: [{"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:52:22 np0005531887 nova_compute[186849]: 2025-11-22 07:52:22.957 186853 DEBUG oslo_concurrency.lockutils [req-a0747e58-3028-4276-b58b-fcd575d20c64 req-8676c053-2b97-4ae8-acc6-bdf4bbb405db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:52:23 np0005531887 nova_compute[186849]: 2025-11-22 07:52:23.395 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.165 186853 DEBUG nova.compute.manager [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.166 186853 DEBUG oslo_concurrency.lockutils [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.166 186853 DEBUG oslo_concurrency.lockutils [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.167 186853 DEBUG oslo_concurrency.lockutils [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.167 186853 DEBUG nova.compute.manager [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] No waiting events found dispatching network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:52:24 np0005531887 nova_compute[186849]: 2025-11-22 07:52:24.167 186853 WARNING nova.compute.manager [req-3c15d5f8-8dbb-4f04-9e0c-5f24632923fb req-db39ff08-38e8-4f9b-85bb-323d3690c651 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received unexpected event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:52:25 np0005531887 nova_compute[186849]: 2025-11-22 07:52:25.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:27 np0005531887 podman[220459]: 2025-11-22 07:52:27.843015791 +0000 UTC m=+0.054832123 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.659 186853 DEBUG oslo_concurrency.lockutils [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.659 186853 DEBUG oslo_concurrency.lockutils [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.660 186853 DEBUG nova.compute.manager [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.664 186853 DEBUG nova.compute.manager [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.665 186853 DEBUG nova.objects.instance [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'flavor' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.695 186853 DEBUG nova.objects.instance [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'info_cache' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:28 np0005531887 nova_compute[186849]: 2025-11-22 07:52:28.738 186853 DEBUG nova.virt.libvirt.driver [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:52:30 np0005531887 nova_compute[186849]: 2025-11-22 07:52:30.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:32 np0005531887 podman[220484]: 2025-11-22 07:52:32.05384572 +0000 UTC m=+0.085693068 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base 
Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:52:32 np0005531887 podman[220485]: 2025-11-22 07:52:32.057093009 +0000 UTC m=+0.084818606 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:52:33 np0005531887 nova_compute[186849]: 2025-11-22 07:52:33.400 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:35Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:f9:da 10.100.0.6
Nov 22 02:52:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:35Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:f9:da 10.100.0.6
Nov 22 02:52:35 np0005531887 nova_compute[186849]: 2025-11-22 07:52:35.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.665 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'name': 'tempest-DeleteServersTestJSON-server-1914028081', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'hostId': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.682 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ffe7f39-506a-4c97-a665-f8315ab9a8f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'timestamp': '2025-11-22T07:52:36.666018', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '36245a3c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.352549257, 'message_signature': '93d1d5ceec9c4add4341fcca2e8766c83cd5c791fe456f350ab276ca6c47ac4a'}]}, 'timestamp': '2025-11-22 07:52:36.684292', '_unique_id': '8c483b682f214c2fb6db251fe1f14fb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.689 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 / tap8eb86bde-6b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.689 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c984948-b2fa-42cd-8601-b33e2a4a602f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.687137', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '3625482a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '6fab2184136e55363524cb02ade58048bf4190d70fa8ab6a99d99662e1ae18d6'}]}, 'timestamp': '2025-11-22 07:52:36.690324', '_unique_id': '14ef4979b69f4f579d29005ee59fe99c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.692 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/cpu volume: 13230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2b2a38-dd28-444b-8873-85afe8560592', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13230000000, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'timestamp': '2025-11-22T07:52:36.692367', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3625a64e-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.352549257, 'message_signature': '40aa305bc0ff7b5225b657ed03b6d9064b98e73003b82fb8bd2beef7e0c822ff'}]}, 'timestamp': '2025-11-22 07:52:36.692672', '_unique_id': '31090356c4344139912744f8c6e1e8ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30a6a8be-6600-41e5-bc33-d6e03e3cd03a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.694158', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '3625ec62-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '39ff66c38af992d71b3c010e6cf969b011706e198a6a0ce4a1ad146621808d56'}]}, 'timestamp': '2025-11-22 07:52:36.694483', '_unique_id': '87834f7b2f11404bacc7116bc0cef74d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.704 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.705 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf624c53-c1e2-455c-a911-6526979f8585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.696030', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '36278d74-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': '8fe40bec5fcdb55a78f30de8edee2b33eba978f8572f04dda93dd47bdf93f2f7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.696030', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362799e0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': '4d91a187530a25c79be4db16de23fe97e597d03f7a5c51c3f215b4107fc84f91'}]}, 'timestamp': '2025-11-22 07:52:36.705434', '_unique_id': '3ce8038a10174647a7ad7563baafb97c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.731 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.bytes volume: 72712192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.731 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7f55ff3-1af8-42f2-985d-1c3b7d326e3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72712192, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.707392', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362ba01c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': 'd3769286a20a121c2bade985371804ca4189dc1dc2157cd5e9879f8fa7822f7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.707392', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362bada0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '60026d2a6dbb79dab78c149ecb6b66bc009ea9d10aec4c96b57f19cb5e101d4b'}]}, 'timestamp': '2025-11-22 07:52:36.732162', '_unique_id': '22bee927a7524681b424ce7d2274216d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.734 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.734 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f293e07-b51a-4a82-a19c-1a1cc602dbc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.734287', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362c0c1e-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': '3027b07093a98c45d0e2448a08220de90e8ad324ce424c33f4a6c6f91a114ace'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 
'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.734287', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362c1970-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': '2e96cfce95a8a63485d6bed0a31e2b0d65af8c84146e9edb62ea6d5c2d5cc9b9'}]}, 'timestamp': '2025-11-22 07:52:36.734945', '_unique_id': '27342b741ee6429dbda857e11417b0cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.736 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.736 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>]
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.737 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.requests volume: 1131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.737 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '444ae503-d51c-4ed4-8daa-78774eb80227', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1131, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.737226', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362c80f4-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': 'eb1b80eb94076cea6999d53b33102c5ae8be9ea86d9aba1b9419091d83456572'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': 
None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.737226', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362c8b62-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '0562df7191e53978b204724bec69ea12dfd554813998e41641020d41d16ac925'}]}, 'timestamp': '2025-11-22 07:52:36.737856', '_unique_id': '311a8dea0d334f11b4108fe4e815ac47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.739 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c87c4f73-5470-4e1e-bbbc-cc8d1a5d7412', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.739708', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362ce0da-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '1ac7e9a2311ee4b34b72c9428af6dc5c13c5b8f15bde3f09b0d65a06b93c1241'}]}, 'timestamp': '2025-11-22 07:52:36.740044', '_unique_id': '22b95c243e994ce09d00999b1bcb56d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.741 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.742 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>]
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.742 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68c19fd0-5bb9-417d-b38c-56b5c9463588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.742405', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362d4aac-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '70e26fc1aec13bb3848a99b21cd15ebf7723e628b7461df8e3faedb9884f5ff3'}]}, 'timestamp': '2025-11-22 07:52:36.742755', '_unique_id': 'fad4f32d116d43f18b709098f6038ea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.745 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.bytes volume: 31001088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.745 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60a0841d-4eaa-4f82-add2-f3874cc2a72e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31001088, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.744970', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362db78a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': 'c114ef2fb2677a9945d52fb5afd06865e32a830fc1967fa72336911cccaefb15'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.744970', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362dc216-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '23c88bdfada901eaed60f3ec917f59ac62b27e12f290e7c1e65919a42884d225'}]}, 'timestamp': '2025-11-22 07:52:36.745781', '_unique_id': '9033ea823e284d8083b5c0003750aac7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.747 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.747 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>]
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.748 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.latency volume: 3724068436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.748 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab92eadb-27ef-421e-96d1-2179ef141a56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3724068436, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.747994', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362e22c4-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '21da09323320c500862f800e0358f709aac41b2726e5d1aaea1dedfe40606328'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.747994', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362e2e72-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '96af2a12ba26c4e038b05785ee51d7e04319b1e41f403e4495c22355d5ed0d75'}]}, 'timestamp': '2025-11-22 07:52:36.748582', '_unique_id': '786acaa0a4a545268793c9989b75d9b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.750 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.750 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84eb510c-8a51-4b4b-8512-774aff7731a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.750308', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362e7eb8-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': 'a4e0e04dd70bf29cb1cce263551400499cbb9ade4fab609318d71525fc8e94e7'}]}, 'timestamp': '2025-11-22 07:52:36.750662', '_unique_id': 'ed4883c02a834bf79b7b8cd414a4a766'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee30b705-9aeb-475a-87cd-02937ef41ada', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.752177', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362ec6fc-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': 'fee40676d9563d019aa57b41b4fc532e5b5677d9b153eddfad0d40c9090401ee'}]}, 'timestamp': '2025-11-22 07:52:36.752470', '_unique_id': 'd32776da53714c518172591984fe52ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.753 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4454693c-76a7-4da5-917b-0e40c00603e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.753920', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362f0a36-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '2ae2c5b5c153e3cc580dca35df47659acd539111802c1fc07693d6849bd5534e'}]}, 'timestamp': '2025-11-22 07:52:36.754191', '_unique_id': '349ea40604224f5fa111f4fac79f2eb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.755 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.requests volume: 275 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.755 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ae4b78e-8b5b-493a-ad3d-31a91ebf9ebc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 275, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.755704', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362f50c2-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': 'd786debd676c7a67ce9b96c27be0b94b0bf8ca02a5feab4dd170292d4ee963be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.755704', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362f5a72-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '0e658306fbfbec3270042e10eed015bfd8f83a89154a72df3521d00afa790e08'}]}, 'timestamp': '2025-11-22 07:52:36.756233', '_unique_id': '3f1de408e52e409682d317777f262651'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.757 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68626c68-f5c1-41b6-9b6c-e306e74ad3fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.757805', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '362fa356-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': 'e7a4b269725773320f9794d5a26320cbb1ed4b4c3cea18a4f2684b7b39f0395e'}]}, 'timestamp': '2025-11-22 07:52:36.758114', '_unique_id': 'f5b6438f993a4cd38caf1bdc770418a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.759 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.759 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4305c98-4fef-4094-8c73-6d955c28c3ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.759702', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '362fec3a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': '875405dee32e348be0fff37461035181404f411b36e7bfb398dacc77120bc3ed'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.759702', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '362ff70c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.365827522, 'message_signature': 'fe02822ce5915782ccbc4acea4833a0b0ca2eba63720ace413af95d5b21b02e1'}]}, 'timestamp': '2025-11-22 07:52:36.760243', '_unique_id': '41d37f28bb384d9aadafae7045af0f24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.761 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.761 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1914028081>]
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.762 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21f91f6c-56bd-41b7-9a17-cd55413f9406', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.762239', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '363050a8-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '1b763c5b3b3e31d1ff4672fa2e088a9c2411b3c04fd288772e97aceab6058b1c'}]}, 'timestamp': '2025-11-22 07:52:36.762587', '_unique_id': 'e58b83b19e774d2b836a3c8f82c33b19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/network.outgoing.bytes volume: 1152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82105af2-b38c-42cc-b357-07bf0fe7edbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1152, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-00000037-f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-tap8eb86bde-6b', 'timestamp': '2025-11-22T07:52:36.764135', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'tap8eb86bde-6b', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3d:f9:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8eb86bde-6b'}, 'message_id': '36309bb2-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.356952145, 'message_signature': '078bb333373e5ac5103731aae4bce742e34e05d41cfadbd2a8f33e5f6dc9a5de'}]}, 'timestamp': '2025-11-22 07:52:36.764476', '_unique_id': '6a5d2eac62c34bf0bcc342400fed7362'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.766 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.latency volume: 694948739 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.766 12 DEBUG ceilometer.compute.pollsters [-] f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk.device.read.latency volume: 48485118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f499a39-cb13-4978-b7aa-7c1370f229c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 694948739, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-vda', 'timestamp': '2025-11-22T07:52:36.766008', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3630e2b6-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': 'b50c72541cc300ae6dbe8fef484b2dbdee60b687ba5d25a4b9be85277ba6d53f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48485118, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': 
None, 'resource_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-sda', 'timestamp': '2025-11-22T07:52:36.766008', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1914028081', 'name': 'instance-00000037', 'instance_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'instance_type': 'm1.nano', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '363108ea-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4699.37721914, 'message_signature': '04a491def2ac5658fa9315350cf5c5f3f349e46f021a4b1e81a6a0e9be78015a'}]}, 'timestamp': '2025-11-22 07:52:36.767333', '_unique_id': '23c834079c734486870307e50f9f74e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:52:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:37.323 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:37.324 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:37.325 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:38 np0005531887 nova_compute[186849]: 2025-11-22 07:52:38.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:38 np0005531887 nova_compute[186849]: 2025-11-22 07:52:38.777 186853 DEBUG nova.virt.libvirt.driver [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:52:39 np0005531887 nova_compute[186849]: 2025-11-22 07:52:39.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:39 np0005531887 podman[220541]: 2025-11-22 07:52:39.840605174 +0000 UTC m=+0.056224846 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:52:40 np0005531887 nova_compute[186849]: 2025-11-22 07:52:40.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:40 np0005531887 kernel: tap8eb86bde-6b (unregistering): left promiscuous mode
Nov 22 02:52:40 np0005531887 NetworkManager[55210]: <info>  [1763797960.9738] device (tap8eb86bde-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:52:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:40Z|00115|binding|INFO|Releasing lport 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 from this chassis (sb_readonly=0)
Nov 22 02:52:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:40Z|00116|binding|INFO|Setting lport 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 down in Southbound
Nov 22 02:52:40 np0005531887 nova_compute[186849]: 2025-11-22 07:52:40.982 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:52:40Z|00117|binding|INFO|Removing iface tap8eb86bde-6b ovn-installed in OVS
Nov 22 02:52:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:40.992 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:f9:da 10.100.0.6'], port_security=['fa:16:3e:3d:f9:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f4f28646-e9a4-464e-a7f6-db4a7d0d83a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=8eb86bde-6b5b-491d-9ae4-94dcc46b6169) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:52:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:40.994 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 8eb86bde-6b5b-491d-9ae4-94dcc46b6169 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:52:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:40.995 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:52:40 np0005531887 nova_compute[186849]: 2025-11-22 07:52:40.996 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:40.997 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ec9ba980-4520-4cac-aeea-7145f7a27c5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:40.998 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:52:41 np0005531887 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000037.scope: Deactivated successfully.
Nov 22 02:52:41 np0005531887 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000037.scope: Consumed 15.127s CPU time.
Nov 22 02:52:41 np0005531887 systemd-machined[153180]: Machine qemu-20-instance-00000037 terminated.
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.175 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.176 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [NOTICE]   (220440) : haproxy version is 2.8.14-c23fe91
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [NOTICE]   (220440) : path to executable is /usr/sbin/haproxy
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [WARNING]  (220440) : Exiting Master process...
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [WARNING]  (220440) : Exiting Master process...
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [ALERT]    (220440) : Current worker (220447) exited with code 143 (Terminated)
Nov 22 02:52:41 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[220420]: [WARNING]  (220440) : All workers exited. Exiting... (0)
Nov 22 02:52:41 np0005531887 systemd[1]: libpod-bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a.scope: Deactivated successfully.
Nov 22 02:52:41 np0005531887 podman[220589]: 2025-11-22 07:52:41.231290151 +0000 UTC m=+0.147652223 container died bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 02:52:41 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a-userdata-shm.mount: Deactivated successfully.
Nov 22 02:52:41 np0005531887 systemd[1]: var-lib-containers-storage-overlay-c798bf80d9165c6eb18676c155599559a84ef3a1272473aef330213583ebe1c8-merged.mount: Deactivated successfully.
Nov 22 02:52:41 np0005531887 podman[220589]: 2025-11-22 07:52:41.575931515 +0000 UTC m=+0.492293597 container cleanup bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 02:52:41 np0005531887 systemd[1]: libpod-conmon-bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a.scope: Deactivated successfully.
Nov 22 02:52:41 np0005531887 podman[220638]: 2025-11-22 07:52:41.666698045 +0000 UTC m=+0.069466190 container remove bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.675 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6772c626-4a9c-4ed7-b52f-81f0b83f6455]: (4, ('Sat Nov 22 07:52:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a)\nbf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a\nSat Nov 22 07:52:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (bf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a)\nbf6e758a090fff42b2b18cfcd3f2948d8d0ff083a5f92f985acc81060c1fe90a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.676 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[92c6da97-ef3c-46d0-ad37-295ef1d1f18f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.678 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:41 np0005531887 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.680 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.698 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.702 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5df0d2c9-5d08-400d-9416-f794ad6b9cb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.720 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb7057d-2ecc-4290-9d1e-5e69ad309dcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.722 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9a220f-6d2c-4106-baa1-b67aeb995a36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.740 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef3a1eb-c6e0-4c2b-a9ea-0d2a61154990]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468293, 'reachable_time': 30011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220656, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.744 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.744 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[281aad33-836c-4e8e-8e40-524b998613cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:52:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:41.745 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.793 186853 INFO nova.virt.libvirt.driver [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.799 186853 INFO nova.virt.libvirt.driver [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance destroyed successfully.#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.799 186853 DEBUG nova.objects.instance [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'numa_topology' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:41 np0005531887 nova_compute[186849]: 2025-11-22 07:52:41.821 186853 DEBUG nova.compute.manager [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:42 np0005531887 nova_compute[186849]: 2025-11-22 07:52:42.015 186853 DEBUG oslo_concurrency.lockutils [None req-e4d15b4c-727c-4aa9-af57-b0d06240e30b 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:52:42.746 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:43 np0005531887 nova_compute[186849]: 2025-11-22 07:52:43.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:43 np0005531887 nova_compute[186849]: 2025-11-22 07:52:43.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.761 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.804 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.804 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.804 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.805 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:52:44 np0005531887 podman[220657]: 2025-11-22 07:52:44.860238444 +0000 UTC m=+0.078645245 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.918 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.981 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:44 np0005531887 nova_compute[186849]: 2025-11-22 07:52:44.982 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.044 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.174 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.175 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5735MB free_disk=73.42963409423828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.175 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.175 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.178 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.179 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.179 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.179 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.179 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.187 186853 INFO nova.compute.manager [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Terminating instance#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.192 186853 DEBUG nova.compute.manager [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.197 186853 INFO nova.virt.libvirt.driver [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Instance destroyed successfully.#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.197 186853 DEBUG nova.objects.instance [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.217 186853 DEBUG nova.virt.libvirt.vif [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1914028081',display_name='tempest-DeleteServersTestJSON-server-1914028081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1914028081',id=55,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:52:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-x6deem4b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:52:41Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=f4f28646-e9a4-464e-a7f6-db4a7d0d83a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.218 186853 DEBUG nova.network.os_vif_util [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "address": "fa:16:3e:3d:f9:da", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8eb86bde-6b", "ovs_interfaceid": "8eb86bde-6b5b-491d-9ae4-94dcc46b6169", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.218 186853 DEBUG nova.network.os_vif_util [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.218 186853 DEBUG os_vif [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.220 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eb86bde-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.221 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.226 186853 INFO os_vif [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:f9:da,bridge_name='br-int',has_traffic_filtering=True,id=8eb86bde-6b5b-491d-9ae4-94dcc46b6169,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8eb86bde-6b')#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.226 186853 INFO nova.virt.libvirt.driver [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Deleting instance files /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9_del#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.227 186853 INFO nova.virt.libvirt.driver [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Deletion of /var/lib/nova/instances/f4f28646-e9a4-464e-a7f6-db4a7d0d83a9_del complete#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.330 186853 INFO nova.compute.manager [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Took 0.14 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.330 186853 DEBUG oslo.service.loopingcall [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.331 186853 DEBUG nova.compute.manager [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.331 186853 DEBUG nova.network.neutron [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.402 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.422 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.456 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.456 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:45 np0005531887 nova_compute[186849]: 2025-11-22 07:52:45.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.457 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.457 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.457 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.477 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.477 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.478 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.478 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.478 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:52:46 np0005531887 nova_compute[186849]: 2025-11-22 07:52:46.478 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:52:46 np0005531887 podman[220683]: 2025-11-22 07:52:46.87295148 +0000 UTC m=+0.087116292 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.527 186853 DEBUG nova.compute.manager [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.527 186853 DEBUG oslo_concurrency.lockutils [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.528 186853 DEBUG oslo_concurrency.lockutils [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.528 186853 DEBUG oslo_concurrency.lockutils [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.528 186853 DEBUG nova.compute.manager [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] No waiting events found dispatching network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.528 186853 WARNING nova.compute.manager [req-21654b70-d21f-4d44-b29e-67c6e37cceaa req-920a0dcf-147f-4206-aa41-840d93d71288 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received unexpected event network-vif-plugged-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 for instance with vm_state stopped and task_state deleting.#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.566 186853 DEBUG nova.network.neutron [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.643 186853 INFO nova.compute.manager [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Took 2.31 seconds to deallocate network for instance.#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.765 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.766 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.840 186853 DEBUG nova.compute.provider_tree [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.892 186853 DEBUG nova.scheduler.client.report [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.919 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:47 np0005531887 nova_compute[186849]: 2025-11-22 07:52:47.976 186853 INFO nova.scheduler.client.report [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance f4f28646-e9a4-464e-a7f6-db4a7d0d83a9#033[00m
Nov 22 02:52:48 np0005531887 nova_compute[186849]: 2025-11-22 07:52:48.103 186853 DEBUG oslo_concurrency.lockutils [None req-2de74473-ece6-4c47-9f6a-a9145a026c50 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "f4f28646-e9a4-464e-a7f6-db4a7d0d83a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:52:49 np0005531887 nova_compute[186849]: 2025-11-22 07:52:49.659 186853 DEBUG nova.compute.manager [req-7348fd33-5ad6-479e-b779-975279fdb439 req-edb6bd88-efc8-4b60-b548-ae5c919f79b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Received event network-vif-deleted-8eb86bde-6b5b-491d-9ae4-94dcc46b6169 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:52:50 np0005531887 nova_compute[186849]: 2025-11-22 07:52:50.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:50 np0005531887 nova_compute[186849]: 2025-11-22 07:52:50.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:51 np0005531887 podman[220705]: 2025-11-22 07:52:51.868414967 +0000 UTC m=+0.084671374 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:52:55 np0005531887 nova_compute[186849]: 2025-11-22 07:52:55.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:55 np0005531887 nova_compute[186849]: 2025-11-22 07:52:55.553 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:52:56 np0005531887 nova_compute[186849]: 2025-11-22 07:52:56.263 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797961.262077, f4f28646-e9a4-464e-a7f6-db4a7d0d83a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:52:56 np0005531887 nova_compute[186849]: 2025-11-22 07:52:56.263 186853 INFO nova.compute.manager [-] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:52:56 np0005531887 nova_compute[186849]: 2025-11-22 07:52:56.281 186853 DEBUG nova.compute.manager [None req-ffd21056-2d3a-43f3-89ca-ed6b481bb288 - - - - - -] [instance: f4f28646-e9a4-464e-a7f6-db4a7d0d83a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:52:58 np0005531887 podman[220730]: 2025-11-22 07:52:58.853189136 +0000 UTC m=+0.069979123 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 02:53:00 np0005531887 nova_compute[186849]: 2025-11-22 07:53:00.228 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:00 np0005531887 nova_compute[186849]: 2025-11-22 07:53:00.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:02 np0005531887 podman[220752]: 2025-11-22 07:53:02.848849631 +0000 UTC m=+0.070488275 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 22 02:53:02 np0005531887 podman[220753]: 2025-11-22 07:53:02.879076021 +0000 UTC m=+0.096134713 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:53:05 np0005531887 nova_compute[186849]: 2025-11-22 07:53:05.231 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:05 np0005531887 nova_compute[186849]: 2025-11-22 07:53:05.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:10 np0005531887 nova_compute[186849]: 2025-11-22 07:53:10.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:10 np0005531887 nova_compute[186849]: 2025-11-22 07:53:10.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:10 np0005531887 podman[220797]: 2025-11-22 07:53:10.852742536 +0000 UTC m=+0.077172998 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:53:15 np0005531887 nova_compute[186849]: 2025-11-22 07:53:15.237 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:15 np0005531887 nova_compute[186849]: 2025-11-22 07:53:15.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:15 np0005531887 podman[220823]: 2025-11-22 07:53:15.846646695 +0000 UTC m=+0.062327766 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:53:17 np0005531887 podman[220843]: 2025-11-22 07:53:17.835708553 +0000 UTC m=+0.059984509 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:53:20 np0005531887 nova_compute[186849]: 2025-11-22 07:53:20.240 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:20 np0005531887 nova_compute[186849]: 2025-11-22 07:53:20.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:22.186 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:53:22 np0005531887 nova_compute[186849]: 2025-11-22 07:53:22.187 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:22.188 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:53:22 np0005531887 podman[220863]: 2025-11-22 07:53:22.834115812 +0000 UTC m=+0.058707777 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:53:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:24.190 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:53:25 np0005531887 nova_compute[186849]: 2025-11-22 07:53:25.243 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:25 np0005531887 nova_compute[186849]: 2025-11-22 07:53:25.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:29 np0005531887 podman[220887]: 2025-11-22 07:53:29.845599344 +0000 UTC m=+0.055727884 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:53:30 np0005531887 nova_compute[186849]: 2025-11-22 07:53:30.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:30 np0005531887 nova_compute[186849]: 2025-11-22 07:53:30.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:33 np0005531887 podman[220908]: 2025-11-22 07:53:33.842229433 +0000 UTC m=+0.065575606 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:53:33 np0005531887 podman[220909]: 2025-11-22 07:53:33.890557715 +0000 UTC m=+0.108355732 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:53:35 np0005531887 nova_compute[186849]: 2025-11-22 07:53:35.250 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:35 np0005531887 nova_compute[186849]: 2025-11-22 07:53:35.565 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:37.324 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:37.324 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:53:37.324 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:40 np0005531887 nova_compute[186849]: 2025-11-22 07:53:40.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:40 np0005531887 nova_compute[186849]: 2025-11-22 07:53:40.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:41 np0005531887 nova_compute[186849]: 2025-11-22 07:53:41.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:41 np0005531887 nova_compute[186849]: 2025-11-22 07:53:41.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:41 np0005531887 nova_compute[186849]: 2025-11-22 07:53:41.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:41 np0005531887 nova_compute[186849]: 2025-11-22 07:53:41.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:53:41 np0005531887 nova_compute[186849]: 2025-11-22 07:53:41.803 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:53:41 np0005531887 podman[220958]: 2025-11-22 07:53:41.84653942 +0000 UTC m=+0.062015979 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:53:44 np0005531887 nova_compute[186849]: 2025-11-22 07:53:44.795 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.257 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.568 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:45 np0005531887 nova_compute[186849]: 2025-11-22 07:53:45.808 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.015 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.016 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5769MB free_disk=73.45825576782227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.016 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.016 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.335 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.336 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.562 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.593 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.637 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.638 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:53:46 np0005531887 nova_compute[186849]: 2025-11-22 07:53:46.639 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:46 np0005531887 podman[220983]: 2025-11-22 07:53:46.86865744 +0000 UTC m=+0.084764775 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 02:53:47 np0005531887 nova_compute[186849]: 2025-11-22 07:53:47.651 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:47 np0005531887 nova_compute[186849]: 2025-11-22 07:53:47.651 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:53:47 np0005531887 nova_compute[186849]: 2025-11-22 07:53:47.652 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:53:47 np0005531887 nova_compute[186849]: 2025-11-22 07:53:47.676 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 02:53:47 np0005531887 nova_compute[186849]: 2025-11-22 07:53:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:48 np0005531887 podman[221003]: 2025-11-22 07:53:48.847532338 +0000 UTC m=+0.070110845 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 22 02:53:50 np0005531887 nova_compute[186849]: 2025-11-22 07:53:50.263 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:50 np0005531887 nova_compute[186849]: 2025-11-22 07:53:50.571 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:50 np0005531887 nova_compute[186849]: 2025-11-22 07:53:50.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:52 np0005531887 nova_compute[186849]: 2025-11-22 07:53:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:53:52 np0005531887 nova_compute[186849]: 2025-11-22 07:53:52.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:53:53 np0005531887 podman[221023]: 2025-11-22 07:53:53.867727929 +0000 UTC m=+0.087256635 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:53:55 np0005531887 nova_compute[186849]: 2025-11-22 07:53:55.267 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:55 np0005531887 nova_compute[186849]: 2025-11-22 07:53:55.573 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:53:56 np0005531887 nova_compute[186849]: 2025-11-22 07:53:56.676 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:00 np0005531887 nova_compute[186849]: 2025-11-22 07:54:00.270 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:00 np0005531887 nova_compute[186849]: 2025-11-22 07:54:00.576 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:00 np0005531887 podman[221049]: 2025-11-22 07:54:00.845475148 +0000 UTC m=+0.062332387 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6)
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.301 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.302 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.332 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.463 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.464 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.473 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.475 186853 INFO nova.compute.claims [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.726 186853 DEBUG nova.compute.provider_tree [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.744 186853 DEBUG nova.scheduler.client.report [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.792 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.793 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.868 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.869 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.909 186853 INFO nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:54:02 np0005531887 nova_compute[186849]: 2025-11-22 07:54:02.942 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.132 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.134 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.135 186853 INFO nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating image(s)#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.135 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.137 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.139 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.159 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.222 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.223 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.224 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.236 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.295 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.296 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.344 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.346 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.346 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.377 186853 DEBUG nova.policy [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063bf16c91af408ca075c690797e09d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.415 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.416 186853 DEBUG nova.virt.disk.api [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.417 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.482 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.483 186853 DEBUG nova.virt.disk.api [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.484 186853 DEBUG nova.objects.instance [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.509 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.509 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Ensure instance console log exists: /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.510 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.510 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:03 np0005531887 nova_compute[186849]: 2025-11-22 07:54:03.511 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:04 np0005531887 podman[221085]: 2025-11-22 07:54:04.847635311 +0000 UTC m=+0.064147400 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 02:54:04 np0005531887 podman[221086]: 2025-11-22 07:54:04.877378969 +0000 UTC m=+0.088463426 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:05 np0005531887 nova_compute[186849]: 2025-11-22 07:54:05.153 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Successfully created port: 4f77ff1b-e147-4c07-9d9b-feabd33edead _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:54:05 np0005531887 nova_compute[186849]: 2025-11-22 07:54:05.273 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:05 np0005531887 nova_compute[186849]: 2025-11-22 07:54:05.578 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:07 np0005531887 nova_compute[186849]: 2025-11-22 07:54:07.843 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Successfully updated port: 4f77ff1b-e147-4c07-9d9b-feabd33edead _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:54:07 np0005531887 nova_compute[186849]: 2025-11-22 07:54:07.943 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:07 np0005531887 nova_compute[186849]: 2025-11-22 07:54:07.944 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:07 np0005531887 nova_compute[186849]: 2025-11-22 07:54:07.945 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:08 np0005531887 nova_compute[186849]: 2025-11-22 07:54:08.112 186853 DEBUG nova.compute.manager [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-changed-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:08 np0005531887 nova_compute[186849]: 2025-11-22 07:54:08.113 186853 DEBUG nova.compute.manager [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Refreshing instance network info cache due to event network-changed-4f77ff1b-e147-4c07-9d9b-feabd33edead. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:08 np0005531887 nova_compute[186849]: 2025-11-22 07:54:08.113 186853 DEBUG oslo_concurrency.lockutils [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:09 np0005531887 nova_compute[186849]: 2025-11-22 07:54:09.029 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.276 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.579 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.855 186853 DEBUG nova.network.neutron [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Updating instance_info_cache with network_info: [{"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.903 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.904 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance network_info: |[{"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.904 186853 DEBUG oslo_concurrency.lockutils [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.905 186853 DEBUG nova.network.neutron [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Refreshing network info cache for port 4f77ff1b-e147-4c07-9d9b-feabd33edead _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.907 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start _get_guest_xml network_info=[{"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.912 186853 WARNING nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.918 186853 DEBUG nova.virt.libvirt.host [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.919 186853 DEBUG nova.virt.libvirt.host [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.925 186853 DEBUG nova.virt.libvirt.host [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.926 186853 DEBUG nova.virt.libvirt.host [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.927 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.928 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.928 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.929 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.929 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.929 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.930 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.930 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.930 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.930 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.931 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.931 186853 DEBUG nova.virt.hardware [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.937 186853 DEBUG nova.virt.libvirt.vif [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:02Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.938 186853 DEBUG nova.network.os_vif_util [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.939 186853 DEBUG nova.network.os_vif_util [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.940 186853 DEBUG nova.objects.instance [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.973 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <uuid>669c1c7b-c493-4f31-83dd-737239095b63</uuid>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <name>instance-0000003d</name>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-519804368</nova:name>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:54:10</nova:creationTime>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        <nova:port uuid="4f77ff1b-e147-4c07-9d9b-feabd33edead">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="serial">669c1c7b-c493-4f31-83dd-737239095b63</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="uuid">669c1c7b-c493-4f31-83dd-737239095b63</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:c6:b1:2a"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <target dev="tap4f77ff1b-e1"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/console.log" append="off"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:54:10 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:54:10 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:54:10 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:54:10 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.975 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Preparing to wait for external event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.976 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.977 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.977 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.978 186853 DEBUG nova.virt.libvirt.vif [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:02Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.979 186853 DEBUG nova.network.os_vif_util [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.980 186853 DEBUG nova.network.os_vif_util [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.980 186853 DEBUG os_vif [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.982 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.982 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.983 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.988 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.989 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f77ff1b-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.989 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f77ff1b-e1, col_values=(('external_ids', {'iface-id': '4f77ff1b-e147-4c07-9d9b-feabd33edead', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:b1:2a', 'vm-uuid': '669c1c7b-c493-4f31-83dd-737239095b63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:10 np0005531887 NetworkManager[55210]: <info>  [1763798050.9930] manager: (tap4f77ff1b-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 22 02:54:10 np0005531887 nova_compute[186849]: 2025-11-22 07:54:10.996 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.000 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.001 186853 INFO os_vif [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1')#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.063 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.064 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.064 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:c6:b1:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.065 186853 INFO nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Using config drive#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.823 186853 INFO nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating config drive at /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.828 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqlfy42z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:11 np0005531887 nova_compute[186849]: 2025-11-22 07:54:11.954 186853 DEBUG oslo_concurrency.processutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqlfy42z" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:12 np0005531887 kernel: tap4f77ff1b-e1: entered promiscuous mode
Nov 22 02:54:12 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:12Z|00118|binding|INFO|Claiming lport 4f77ff1b-e147-4c07-9d9b-feabd33edead for this chassis.
Nov 22 02:54:12 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:12Z|00119|binding|INFO|4f77ff1b-e147-4c07-9d9b-feabd33edead: Claiming fa:16:3e:c6:b1:2a 10.100.0.8
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.056 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.0596] manager: (tap4f77ff1b-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 22 02:54:12 np0005531887 systemd-udevd[221162]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.102 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.104 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis#033[00m
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.1072] device (tap4f77ff1b-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.106 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f#033[00m
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.1084] device (tap4f77ff1b-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.123 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.126 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f4545ee8-057c-4b78-bd01-a4dd2c7724e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.127 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:12 np0005531887 systemd-machined[153180]: New machine qemu-21-instance-0000003d.
Nov 22 02:54:12 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:12Z|00120|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead ovn-installed in OVS
Nov 22 02:54:12 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:12Z|00121|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead up in Southbound
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.130 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.130 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[98224b64-5ad6-48d8-8057-2a5409401246]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.131 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa29658-e3ce-4158-8f99-0a90a2ffbf80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.132 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 systemd[1]: Started Virtual Machine qemu-21-instance-0000003d.
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.151 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa941bc-1018-4462-9fc8-bb977bc7e496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 podman[221142]: 2025-11-22 07:54:12.1617906 +0000 UTC m=+0.115705542 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.185 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8691ad-b35b-4fd5-b05e-1a5c3e031312]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.227 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c812400a-8cb9-4ef1-9242-3ebae40c023b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.2364] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.235 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6adc29-2470-49ab-a573-6df81048682e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.273 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ef81a1-4c15-41ff-a3a2-b95a8c0cef0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.277 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2938f5f8-338b-4200-b89e-9523d38b752a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.3050] device (tapd54e232a-50): carrier: link connected
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.311 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ad4881-d71b-423f-802e-a36c36a65f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.337 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6e42b785-c31d-4d9b-ace9-c44eee2b4f37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479491, 'reachable_time': 17518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221205, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.358 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[72d058c6-6370-409d-9582-0e8f9208a039]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479491, 'tstamp': 479491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221206, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.383 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b09023ae-e11d-4fc5-95f7-6b281c076ac4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479491, 'reachable_time': 17518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221207, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.422 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a7ca3d1e-c39c-46ee-b0e8-910f19bab5ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.504 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[736b3412-5f7f-4d35-b2ed-28dc8339f52c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.507 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.507 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.508 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:12 np0005531887 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 02:54:12 np0005531887 NetworkManager[55210]: <info>  [1763798052.5114] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.517 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:12 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:12Z|00122|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.519 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.524 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.525 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3258d999-be2d-4ed2-9f63-53cbcbd3c2cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.526 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:12.527 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.532 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.633 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798052.631569, 669c1c7b-c493-4f31-83dd-737239095b63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.639 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.669 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.676 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798052.632393, 669c1c7b-c493-4f31-83dd-737239095b63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.676 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.710 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.717 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:12 np0005531887 nova_compute[186849]: 2025-11-22 07:54:12.740 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:13 np0005531887 podman[221246]: 2025-11-22 07:54:12.934701721 +0000 UTC m=+0.026100289 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:13 np0005531887 podman[221246]: 2025-11-22 07:54:13.400389726 +0000 UTC m=+0.491788284 container create bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.433 186853 DEBUG nova.compute.manager [req-ffb0926d-eba6-476c-b35c-6cbdbce845e5 req-32739d2f-44c8-4a88-99d9-af46ef482c57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.434 186853 DEBUG oslo_concurrency.lockutils [req-ffb0926d-eba6-476c-b35c-6cbdbce845e5 req-32739d2f-44c8-4a88-99d9-af46ef482c57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.434 186853 DEBUG oslo_concurrency.lockutils [req-ffb0926d-eba6-476c-b35c-6cbdbce845e5 req-32739d2f-44c8-4a88-99d9-af46ef482c57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.434 186853 DEBUG oslo_concurrency.lockutils [req-ffb0926d-eba6-476c-b35c-6cbdbce845e5 req-32739d2f-44c8-4a88-99d9-af46ef482c57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.434 186853 DEBUG nova.compute.manager [req-ffb0926d-eba6-476c-b35c-6cbdbce845e5 req-32739d2f-44c8-4a88-99d9-af46ef482c57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Processing event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.435 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.440 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798053.4405727, 669c1c7b-c493-4f31-83dd-737239095b63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.441 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.444 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.448 186853 INFO nova.virt.libvirt.driver [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance spawned successfully.#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.449 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.475 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.479 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.489 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.490 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.491 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.492 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.492 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.493 186853 DEBUG nova.virt.libvirt.driver [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.504 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:13 np0005531887 systemd[1]: Started libpod-conmon-bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73.scope.
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.602 186853 INFO nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Took 10.47 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.603 186853 DEBUG nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:13 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:54:13 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c06c55662e417023b847692bddd6168e78bfe2036e3ebf3b5c7ffd8625258679/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:13 np0005531887 podman[221246]: 2025-11-22 07:54:13.635893937 +0000 UTC m=+0.727292505 container init bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:54:13 np0005531887 podman[221246]: 2025-11-22 07:54:13.644489938 +0000 UTC m=+0.735888486 container start bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:54:13 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [NOTICE]   (221265) : New worker (221267) forked
Nov 22 02:54:13 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [NOTICE]   (221265) : Loading success.
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.701 186853 INFO nova.compute.manager [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Took 11.28 seconds to build instance.#033[00m
Nov 22 02:54:13 np0005531887 nova_compute[186849]: 2025-11-22 07:54:13.724 186853 DEBUG oslo_concurrency.lockutils [None req-ae234ece-b85a-4e68-93c4-a0f7a1f25155 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:14.529 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:14.531 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.556 186853 DEBUG nova.compute.manager [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.814 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.817 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.899 186853 DEBUG nova.objects.instance [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_requests' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.916 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.917 186853 INFO nova.compute.claims [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.917 186853 DEBUG nova.objects.instance [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:14 np0005531887 nova_compute[186849]: 2025-11-22 07:54:14.938 186853 DEBUG nova.objects.instance [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.010 186853 INFO nova.compute.resource_tracker [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating resource usage from migration fe6cc4ba-d1c6-416a-a067-c3816513554b#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.011 186853 DEBUG nova.compute.resource_tracker [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Starting to track incoming migration fe6cc4ba-d1c6-416a-a067-c3816513554b with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.199 186853 DEBUG nova.compute.provider_tree [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.222 186853 DEBUG nova.scheduler.client.report [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.274 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.274 186853 INFO nova.compute.manager [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Migrating#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.588 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.718 186853 DEBUG nova.network.neutron [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Updated VIF entry in instance network info cache for port 4f77ff1b-e147-4c07-9d9b-feabd33edead. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.719 186853 DEBUG nova.network.neutron [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Updating instance_info_cache with network_info: [{"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.793 186853 DEBUG oslo_concurrency.lockutils [req-88a78347-4c15-4c45-b6e3-c6c64ee02147 req-54b0c5c8-9122-4210-98e2-cf47482104f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-669c1c7b-c493-4f31-83dd-737239095b63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.886 186853 DEBUG nova.compute.manager [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.887 186853 DEBUG oslo_concurrency.lockutils [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.888 186853 DEBUG oslo_concurrency.lockutils [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.888 186853 DEBUG oslo_concurrency.lockutils [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.889 186853 DEBUG nova.compute.manager [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.889 186853 WARNING nova.compute.manager [req-7550950e-ef2b-44bc-ab80-0f2cd7226622 req-09c4999c-cfdf-4349-a0af-50b26286d0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state None.#033[00m
Nov 22 02:54:15 np0005531887 nova_compute[186849]: 2025-11-22 07:54:15.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:17.533 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:17 np0005531887 podman[221277]: 2025-11-22 07:54:17.867649709 +0000 UTC m=+0.067863301 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:54:19 np0005531887 podman[221296]: 2025-11-22 07:54:19.885009719 +0000 UTC m=+0.098836139 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:54:20 np0005531887 nova_compute[186849]: 2025-11-22 07:54:20.597 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:20 np0005531887 nova_compute[186849]: 2025-11-22 07:54:20.793 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:20 np0005531887 nova_compute[186849]: 2025-11-22 07:54:20.794 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:20 np0005531887 nova_compute[186849]: 2025-11-22 07:54:20.852 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:54:20 np0005531887 nova_compute[186849]: 2025-11-22 07:54:20.995 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.174 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.175 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.187 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.188 186853 INFO nova.compute.claims [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.544 186853 DEBUG nova.compute.provider_tree [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.576 186853 DEBUG nova.scheduler.client.report [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.691 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.692 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:54:21 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.828 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.830 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:54:21 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:54:21 np0005531887 systemd-logind[821]: New session 38 of user nova.
Nov 22 02:54:21 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:54:21 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.891 186853 INFO nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:54:21 np0005531887 nova_compute[186849]: 2025-11-22 07:54:21.936 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:54:22 np0005531887 systemd[221320]: Queued start job for default target Main User Target.
Nov 22 02:54:22 np0005531887 systemd[221320]: Created slice User Application Slice.
Nov 22 02:54:22 np0005531887 systemd[221320]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:54:22 np0005531887 systemd[221320]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:54:22 np0005531887 systemd[221320]: Reached target Paths.
Nov 22 02:54:22 np0005531887 systemd[221320]: Reached target Timers.
Nov 22 02:54:22 np0005531887 systemd[221320]: Starting D-Bus User Message Bus Socket...
Nov 22 02:54:22 np0005531887 systemd[221320]: Starting Create User's Volatile Files and Directories...
Nov 22 02:54:22 np0005531887 systemd[221320]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:54:22 np0005531887 systemd[221320]: Reached target Sockets.
Nov 22 02:54:22 np0005531887 systemd[221320]: Finished Create User's Volatile Files and Directories.
Nov 22 02:54:22 np0005531887 systemd[221320]: Reached target Basic System.
Nov 22 02:54:22 np0005531887 systemd[221320]: Reached target Main User Target.
Nov 22 02:54:22 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:54:22 np0005531887 systemd[221320]: Startup finished in 163ms.
Nov 22 02:54:22 np0005531887 systemd[1]: Started Session 38 of User nova.
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.162 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.165 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.165 186853 INFO nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Creating image(s)#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.166 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.166 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.167 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:22 np0005531887 systemd[1]: session-38.scope: Deactivated successfully.
Nov 22 02:54:22 np0005531887 systemd-logind[821]: Session 38 logged out. Waiting for processes to exit.
Nov 22 02:54:22 np0005531887 systemd-logind[821]: Removed session 38.
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.182 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.250 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.251 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.252 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.266 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:22 np0005531887 systemd-logind[821]: New session 40 of user nova.
Nov 22 02:54:22 np0005531887 systemd[1]: Started Session 40 of User nova.
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.332 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.334 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.363 186853 DEBUG nova.policy [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.385 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.386 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.387 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:22 np0005531887 systemd[1]: session-40.scope: Deactivated successfully.
Nov 22 02:54:22 np0005531887 systemd-logind[821]: Session 40 logged out. Waiting for processes to exit.
Nov 22 02:54:22 np0005531887 systemd-logind[821]: Removed session 40.
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.454 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.457 186853 DEBUG nova.virt.disk.api [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Checking if we can resize image /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.458 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.520 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.522 186853 DEBUG nova.virt.disk.api [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Cannot resize image /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.523 186853 DEBUG nova.objects.instance [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'migration_context' on Instance uuid 10a29489-706f-428f-b645-1c688d642f0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.547 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.548 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Ensure instance console log exists: /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.549 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.549 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.550 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:22 np0005531887 nova_compute[186849]: 2025-11-22 07:54:22.612 186853 INFO nova.compute.manager [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Rebuilding instance#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.051 186853 DEBUG nova.compute.manager [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.217 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.269 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.280 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.291 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.326 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:54:23 np0005531887 nova_compute[186849]: 2025-11-22 07:54:23.331 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:54:24 np0005531887 nova_compute[186849]: 2025-11-22 07:54:24.254 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Successfully created port: c27f5a73-ae9a-4f31-95ec-4d5fa852d61b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:54:24 np0005531887 podman[221358]: 2025-11-22 07:54:24.870533523 +0000 UTC m=+0.073042478 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 02:54:25 np0005531887 nova_compute[186849]: 2025-11-22 07:54:25.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:25 np0005531887 systemd-logind[821]: New session 41 of user nova.
Nov 22 02:54:25 np0005531887 systemd[1]: Started Session 41 of User nova.
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:25.999 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:26 np0005531887 systemd[1]: session-41.scope: Deactivated successfully.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Session 41 logged out. Waiting for processes to exit.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Removed session 41.
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.301 186853 DEBUG nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.301 186853 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.301 186853 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.302 186853 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.302 186853 DEBUG nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.302 186853 WARNING nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.383 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Successfully updated port: c27f5a73-ae9a-4f31-95ec-4d5fa852d61b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:54:26 np0005531887 systemd-logind[821]: New session 42 of user nova.
Nov 22 02:54:26 np0005531887 systemd[1]: Started Session 42 of User nova.
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.403 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.403 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquired lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.404 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:26 np0005531887 systemd[1]: session-42.scope: Deactivated successfully.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Session 42 logged out. Waiting for processes to exit.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Removed session 42.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: New session 43 of user nova.
Nov 22 02:54:26 np0005531887 systemd[1]: Started Session 43 of User nova.
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.666 186853 DEBUG nova.compute.manager [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-changed-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.668 186853 DEBUG nova.compute.manager [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Refreshing instance network info cache due to event network-changed-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:26 np0005531887 nova_compute[186849]: 2025-11-22 07:54:26.669 186853 DEBUG oslo_concurrency.lockutils [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:26 np0005531887 systemd[1]: session-43.scope: Deactivated successfully.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Session 43 logged out. Waiting for processes to exit.
Nov 22 02:54:26 np0005531887 systemd-logind[821]: Removed session 43.
Nov 22 02:54:27 np0005531887 nova_compute[186849]: 2025-11-22 07:54:27.042 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:54:27 np0005531887 nova_compute[186849]: 2025-11-22 07:54:27.510 186853 INFO nova.network.neutron [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating port a038edb6-47af-4f7e-9f5e-715660b6da32 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.531 186853 DEBUG nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.532 186853 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.533 186853 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.533 186853 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.533 186853 DEBUG nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.534 186853 WARNING nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.635 186853 DEBUG nova.network.neutron [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updating instance_info_cache with network_info: [{"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.657 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Releasing lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.658 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Instance network_info: |[{"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.659 186853 DEBUG oslo_concurrency.lockutils [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.659 186853 DEBUG nova.network.neutron [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Refreshing network info cache for port c27f5a73-ae9a-4f31-95ec-4d5fa852d61b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.663 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Start _get_guest_xml network_info=[{"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.668 186853 WARNING nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.675 186853 DEBUG nova.virt.libvirt.host [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.677 186853 DEBUG nova.virt.libvirt.host [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.688 186853 DEBUG nova.virt.libvirt.host [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.689 186853 DEBUG nova.virt.libvirt.host [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.690 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.691 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.691 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.692 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.692 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.692 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.693 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.693 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.693 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.693 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.694 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.694 186853 DEBUG nova.virt.hardware [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.697 186853 DEBUG nova.virt.libvirt.vif [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-688404127',display_name='tempest-ListServerFiltersTestJSON-instance-688404127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-688404127',id=64,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-d6cq3b30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:22Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=10a29489-706f-428f-b645-1c688d642f0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.698 186853 DEBUG nova.network.os_vif_util [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.699 186853 DEBUG nova.network.os_vif_util [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.700 186853 DEBUG nova.objects.instance [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 10a29489-706f-428f-b645-1c688d642f0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.717 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <uuid>10a29489-706f-428f-b645-1c688d642f0b</uuid>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <name>instance-00000040</name>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <memory>196608</memory>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-688404127</nova:name>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:54:28</nova:creationTime>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.micro">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:memory>192</nova:memory>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:user uuid="6d9b8aa760ed4afdbf24f9deb5d29190">tempest-ListServerFiltersTestJSON-1217253496-project-member</nova:user>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:project uuid="b4ca2b2e65ac4bf8b3d14f3310a3a7bf">tempest-ListServerFiltersTestJSON-1217253496</nova:project>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        <nova:port uuid="c27f5a73-ae9a-4f31-95ec-4d5fa852d61b">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="serial">10a29489-706f-428f-b645-1c688d642f0b</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="uuid">10a29489-706f-428f-b645-1c688d642f0b</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.config"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:15:d5:34"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <target dev="tapc27f5a73-ae"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/console.log" append="off"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:54:28 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:54:28 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:54:28 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:54:28 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.723 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Preparing to wait for external event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.723 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.723 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.724 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.725 186853 DEBUG nova.virt.libvirt.vif [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-688404127',display_name='tempest-ListServerFiltersTestJSON-instance-688404127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-688404127',id=64,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-d6cq3b30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:22Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=10a29489-706f-428f-b645-1c688d642f0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.725 186853 DEBUG nova.network.os_vif_util [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.726 186853 DEBUG nova.network.os_vif_util [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.726 186853 DEBUG os_vif [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.727 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.727 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.728 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.731 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.732 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc27f5a73-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.732 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc27f5a73-ae, col_values=(('external_ids', {'iface-id': 'c27f5a73-ae9a-4f31-95ec-4d5fa852d61b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:d5:34', 'vm-uuid': '10a29489-706f-428f-b645-1c688d642f0b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.734 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:28 np0005531887 NetworkManager[55210]: <info>  [1763798068.7355] manager: (tapc27f5a73-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.739 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.739 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.740 186853 DEBUG nova.network.neutron [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.741 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.743 186853 INFO os_vif [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae')#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.817 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.818 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.818 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No VIF found with MAC fa:16:3e:15:d5:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.819 186853 INFO nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Using config drive#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.925 186853 DEBUG nova.compute.manager [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.926 186853 DEBUG nova.compute.manager [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing instance network info cache due to event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:28 np0005531887 nova_compute[186849]: 2025-11-22 07:54:28.926 186853 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:29Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:b1:2a 10.100.0.8
Nov 22 02:54:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:29Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:b1:2a 10.100.0.8
Nov 22 02:54:29 np0005531887 nova_compute[186849]: 2025-11-22 07:54:29.869 186853 INFO nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Creating config drive at /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.config#033[00m
Nov 22 02:54:29 np0005531887 nova_compute[186849]: 2025-11-22 07:54:29.876 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn9_og1o2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.007 186853 DEBUG oslo_concurrency.processutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn9_og1o2" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:30 np0005531887 kernel: tapc27f5a73-ae: entered promiscuous mode
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.0700] manager: (tapc27f5a73-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Nov 22 02:54:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:30Z|00123|binding|INFO|Claiming lport c27f5a73-ae9a-4f31-95ec-4d5fa852d61b for this chassis.
Nov 22 02:54:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:30Z|00124|binding|INFO|c27f5a73-ae9a-4f31-95ec-4d5fa852d61b: Claiming fa:16:3e:15:d5:34 10.100.0.14
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.071 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.088 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:d5:34 10.100.0.14'], port_security=['fa:16:3e:15:d5:34 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '10a29489-706f-428f-b645-1c688d642f0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.090 104084 INFO neutron.agent.ovn.metadata.agent [-] Port c27f5a73-ae9a-4f31-95ec-4d5fa852d61b in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed bound to our chassis#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.091 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62930ff4-55a3-4e08-8229-5532aa7dcaed#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.106 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 systemd-udevd[221440]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.105 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5c85378b-4be2-4417-a69d-f43ce7915000]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.106 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62930ff4-51 in ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.109 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62930ff4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.110 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9cef843c-3bc6-4bf8-babf-c6200ac0a8bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:30Z|00125|binding|INFO|Setting lport c27f5a73-ae9a-4f31-95ec-4d5fa852d61b ovn-installed in OVS
Nov 22 02:54:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:30Z|00126|binding|INFO|Setting lport c27f5a73-ae9a-4f31-95ec-4d5fa852d61b up in Southbound
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.111 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c7a35c-a15a-4a08-8648-acb5b946449b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.115 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.125 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[c14742da-14d9-4b10-bcc0-567f104a0b80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 systemd-machined[153180]: New machine qemu-22-instance-00000040.
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.1329] device (tapc27f5a73-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.1339] device (tapc27f5a73-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.141 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b4bfd6-ea73-4c04-947b-8ed95c578b84]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 systemd[1]: Started Virtual Machine qemu-22-instance-00000040.
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.176 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3167ceb7-efee-4691-aa77-3ea6a088454f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.1878] manager: (tap62930ff4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.186 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[05482e40-6b7d-4084-be24-00aa8563864b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.223 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ac42df69-0f47-4dad-a7a2-1a3ef2afd103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.228 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e6806b0b-06bc-4e2f-a575-b49a1e285396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.2598] device (tap62930ff4-50): carrier: link connected
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.263 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4acdc6-137b-42af-9bff-6597aa660d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.283 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[50a39550-dd20-46ed-8065-e29e095ef30c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481286, 'reachable_time': 34502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221473, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.299 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[96be3b07-fd5e-4b2c-ba67-b927e9b16c4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:714'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 481286, 'tstamp': 481286}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221474, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.321 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[785eb102-be65-45b5-b0ac-bb470c6b8099]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481286, 'reachable_time': 34502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221475, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.355 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ced644e9-6f32-4753-a378-dd32176b6e56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.427 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7085063c-68da-49a6-80cc-521c98d26d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.430 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.430 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.430 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62930ff4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 NetworkManager[55210]: <info>  [1763798070.4339] manager: (tap62930ff4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 22 02:54:30 np0005531887 kernel: tap62930ff4-50: entered promiscuous mode
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.439 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62930ff4-50, col_values=(('external_ids', {'iface-id': '02324e7a-c5bf-443b-a6e3-5a1cdac9fee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.441 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.443 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:30 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:30Z|00127|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.444 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d60ff780-e118-4a68-a8dd-6a06c0ca2ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.445 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:30.447 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'env', 'PROCESS_TAG=haproxy-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62930ff4-55a3-4e08-8229-5532aa7dcaed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.457 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.601 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:30 np0005531887 podman[221509]: 2025-11-22 07:54:30.848541781 +0000 UTC m=+0.062388078 container create a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.869 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798070.869425, 10a29489-706f-428f-b645-1c688d642f0b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.870 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:30 np0005531887 systemd[1]: Started libpod-conmon-a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab.scope.
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.900 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.906 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798070.8695493, 10a29489-706f-428f-b645-1c688d642f0b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.906 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:54:30 np0005531887 podman[221509]: 2025-11-22 07:54:30.814475157 +0000 UTC m=+0.028321484 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:30 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:54:30 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59c2ae45fd39adfa52908665011272049330b1170d173531091d346bd390324d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.933 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.937 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:30 np0005531887 nova_compute[186849]: 2025-11-22 07:54:30.958 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:30 np0005531887 podman[221509]: 2025-11-22 07:54:30.968591188 +0000 UTC m=+0.182437525 container init a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 02:54:30 np0005531887 podman[221528]: 2025-11-22 07:54:30.969764857 +0000 UTC m=+0.094437661 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:54:30 np0005531887 podman[221509]: 2025-11-22 07:54:30.975163999 +0000 UTC m=+0.189010316 container start a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 02:54:30 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [NOTICE]   (221555) : New worker (221557) forked
Nov 22 02:54:30 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [NOTICE]   (221555) : Loading success.
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.126 186853 DEBUG nova.compute.manager [req-237d7556-abf7-49a8-95c9-735f637f0dd2 req-034a520a-cdc3-421d-96cc-640205aa27e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.126 186853 DEBUG oslo_concurrency.lockutils [req-237d7556-abf7-49a8-95c9-735f637f0dd2 req-034a520a-cdc3-421d-96cc-640205aa27e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.127 186853 DEBUG oslo_concurrency.lockutils [req-237d7556-abf7-49a8-95c9-735f637f0dd2 req-034a520a-cdc3-421d-96cc-640205aa27e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.127 186853 DEBUG oslo_concurrency.lockutils [req-237d7556-abf7-49a8-95c9-735f637f0dd2 req-034a520a-cdc3-421d-96cc-640205aa27e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.127 186853 DEBUG nova.compute.manager [req-237d7556-abf7-49a8-95c9-735f637f0dd2 req-034a520a-cdc3-421d-96cc-640205aa27e5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Processing event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.128 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.134 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798071.1336675, 10a29489-706f-428f-b645-1c688d642f0b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.134 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.136 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.140 186853 INFO nova.virt.libvirt.driver [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Instance spawned successfully.#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.140 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.166 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.170 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.170 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.171 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.171 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.172 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.172 186853 DEBUG nova.virt.libvirt.driver [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.177 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.224 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.264 186853 INFO nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Took 9.10 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.265 186853 DEBUG nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.361 186853 INFO nova.compute.manager [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Took 10.25 seconds to build instance.#033[00m
Nov 22 02:54:31 np0005531887 nova_compute[186849]: 2025-11-22 07:54:31.395 186853 DEBUG oslo_concurrency.lockutils [None req-00dcfbad-6e8f-4091-bdc9-7a2f8fd1dcd2 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:32 np0005531887 nova_compute[186849]: 2025-11-22 07:54:32.049 186853 DEBUG nova.network.neutron [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updated VIF entry in instance network info cache for port c27f5a73-ae9a-4f31-95ec-4d5fa852d61b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:54:32 np0005531887 nova_compute[186849]: 2025-11-22 07:54:32.049 186853 DEBUG nova.network.neutron [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updating instance_info_cache with network_info: [{"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:32 np0005531887 nova_compute[186849]: 2025-11-22 07:54:32.070 186853 DEBUG oslo_concurrency.lockutils [req-59e97192-c52c-48de-948a-8442df99d8f7 req-8bdf2ee4-bb19-4807-81d6-38b6926c9d3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.055 186853 DEBUG nova.network.neutron [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.084 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.088 186853 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.088 186853 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.230 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.232 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.232 186853 INFO nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Creating image(s)#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.233 186853 DEBUG nova.objects.instance [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.270 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.336 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.337 186853 DEBUG nova.virt.disk.api [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.338 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.377 186853 DEBUG nova.compute.manager [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.378 186853 DEBUG oslo_concurrency.lockutils [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.378 186853 DEBUG oslo_concurrency.lockutils [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.378 186853 DEBUG oslo_concurrency.lockutils [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.379 186853 DEBUG nova.compute.manager [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] No waiting events found dispatching network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.379 186853 WARNING nova.compute.manager [req-7bfb17f5-44a9-4c2d-82f0-a13de95141de req-f476c4d0-b3c4-4fa4-a938-149053d17c98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received unexpected event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b for instance with vm_state active and task_state None.#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.383 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.411 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.412 186853 DEBUG nova.virt.disk.api [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.424 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.425 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Ensure instance console log exists: /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.426 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.426 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.426 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.429 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start _get_guest_xml network_info=[{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.433 186853 WARNING nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.454 186853 DEBUG nova.virt.libvirt.host [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.455 186853 DEBUG nova.virt.libvirt.host [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.459 186853 DEBUG nova.virt.libvirt.host [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.460 186853 DEBUG nova.virt.libvirt.host [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.461 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.461 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.462 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.462 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.462 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.462 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.463 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.463 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.463 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.464 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.464 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.464 186853 DEBUG nova.virt.hardware [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.464 186853 DEBUG nova.objects.instance [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.486 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.544 186853 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.547 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.548 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.550 186853 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.552 186853 DEBUG nova.virt.libvirt.vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:27Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.553 186853 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.554 186853 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.558 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <uuid>c64e78b6-87b2-425c-aef9-771bcd042d58</uuid>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <name>instance-0000003b</name>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <memory>196608</memory>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:name>tempest-DeleteServersTestJSON-server-1711924286</nova:name>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:54:33</nova:creationTime>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.micro">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:memory>192</nova:memory>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        <nova:port uuid="a038edb6-47af-4f7e-9f5e-715660b6da32">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="serial">c64e78b6-87b2-425c-aef9-771bcd042d58</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="uuid">c64e78b6-87b2-425c-aef9-771bcd042d58</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:36:ab:fc"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <target dev="tapa038edb6-47"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/console.log" append="off"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:54:33 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:54:33 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:54:33 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:54:33 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.560 186853 DEBUG nova.virt.libvirt.vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:27Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.561 186853 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.561 186853 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.562 186853 DEBUG os_vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.563 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.563 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.564 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.568 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.568 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa038edb6-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.569 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa038edb6-47, col_values=(('external_ids', {'iface-id': 'a038edb6-47af-4f7e-9f5e-715660b6da32', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:ab:fc', 'vm-uuid': 'c64e78b6-87b2-425c-aef9-771bcd042d58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.570 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.5717] manager: (tapa038edb6-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.573 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.580 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.581 186853 INFO os_vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47')#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.654 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.655 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.656 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:36:ab:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.656 186853 INFO nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Using config drive#033[00m
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.7205] manager: (tapa038edb6-47): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Nov 22 02:54:33 np0005531887 kernel: tapa038edb6-47: entered promiscuous mode
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.736 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:33Z|00128|binding|INFO|Claiming lport a038edb6-47af-4f7e-9f5e-715660b6da32 for this chassis.
Nov 22 02:54:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:33Z|00129|binding|INFO|a038edb6-47af-4f7e-9f5e-715660b6da32: Claiming fa:16:3e:36:ab:fc 10.100.0.7
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.747 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:ab:fc 10.100.0.7'], port_security=['fa:16:3e:36:ab:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a038edb6-47af-4f7e-9f5e-715660b6da32) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.748 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a038edb6-47af-4f7e-9f5e-715660b6da32 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.750 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c#033[00m
Nov 22 02:54:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:33Z|00130|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 ovn-installed in OVS
Nov 22 02:54:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:33Z|00131|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 up in Southbound
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.757 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 nova_compute[186849]: 2025-11-22 07:54:33.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.762 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe20422-0826-4068-84ef-03a23521ede7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.765 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.766 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.767 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95f0abec-4052-4102-a180-d48249d4c343]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.768 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[840ad782-27cb-4642-9784-4f092ce3d93f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 systemd-machined[153180]: New machine qemu-23-instance-0000003b.
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.780 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[35418320-bb22-47fe-b0c5-35904e31e8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 systemd[1]: Started Virtual Machine qemu-23-instance-0000003b.
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.799 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2249d116-8a49-4680-9274-ee90d8fb8692]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 systemd-udevd[221596]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.8222] device (tapa038edb6-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.8229] device (tapa038edb6-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.837 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[620968c0-edfc-4010-bc82-6dead8caa19c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.8510] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.850 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[40be1bf2-0f74-4c1d-8eb1-07cef705b1ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.888 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb83650-dec7-49fc-a8d9-fadf8d96f182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.891 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[753c8347-f113-4f23-be9a-50861b219535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 NetworkManager[55210]: <info>  [1763798073.9134] device (tap5e910dbb-20): carrier: link connected
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.921 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[44d52038-cdf2-4420-badc-692bfdd474ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.941 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b1651ec7-4a70-4a52-a4b1-4aef445cd76e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481652, 'reachable_time': 17911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221625, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.964 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[85a7e945-5329-4b85-8f88-ba72152de7f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 481652, 'tstamp': 481652}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221626, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:33.983 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3943b2-3893-4a46-a232-424715d1591a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481652, 'reachable_time': 17911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221627, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.020 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b48a35ef-8d7b-4f08-b3cb-f5d3424f42f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.092 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ea51c5bc-f55d-476a-a5e6-97494dd0e8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.094 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.094 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.094 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:34 np0005531887 NetworkManager[55210]: <info>  [1763798074.0972] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 22 02:54:34 np0005531887 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.096 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.101 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:34 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:34Z|00132|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.103 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.119 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.118 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.121 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcc1bf2-240d-453c-a2b2-12ba08072fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.122 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:34.122 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.203 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798074.2027059, c64e78b6-87b2-425c-aef9-771bcd042d58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.203 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.205 186853 DEBUG nova.compute.manager [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.209 186853 INFO nova.virt.libvirt.driver [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance running successfully.#033[00m
Nov 22 02:54:34 np0005531887 virtqemud[186424]: argument unsupported: QEMU guest agent is not configured
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.212 186853 DEBUG nova.virt.libvirt.guest [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.212 186853 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.232 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.236 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.262 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.263 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798074.2036688, c64e78b6-87b2-425c-aef9-771bcd042d58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.263 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.288 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.291 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:34 np0005531887 nova_compute[186849]: 2025-11-22 07:54:34.316 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:54:34 np0005531887 podman[221665]: 2025-11-22 07:54:34.544081652 +0000 UTC m=+0.062745537 container create f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 02:54:34 np0005531887 systemd[1]: Started libpod-conmon-f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf.scope.
Nov 22 02:54:34 np0005531887 podman[221665]: 2025-11-22 07:54:34.509656059 +0000 UTC m=+0.028319964 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:34 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:54:34 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f4cc6e409bad27ed3371f6eae91caca30201c042bb128ba6a8f0afbb86589d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:34 np0005531887 podman[221665]: 2025-11-22 07:54:34.639381353 +0000 UTC m=+0.158045258 container init f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 02:54:34 np0005531887 podman[221665]: 2025-11-22 07:54:34.647142434 +0000 UTC m=+0.165806319 container start f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:34 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [NOTICE]   (221684) : New worker (221686) forked
Nov 22 02:54:34 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [NOTICE]   (221684) : Loading success.
Nov 22 02:54:35 np0005531887 kernel: tap4f77ff1b-e1 (unregistering): left promiscuous mode
Nov 22 02:54:35 np0005531887 NetworkManager[55210]: <info>  [1763798075.6073] device (tap4f77ff1b-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00133|binding|INFO|Releasing lport 4f77ff1b-e147-4c07-9d9b-feabd33edead from this chassis (sb_readonly=0)
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00134|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead down in Southbound
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00135|binding|INFO|Removing iface tap4f77ff1b-e1 ovn-installed in OVS
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.638 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.640 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.642 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.643 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[beff51bf-6486-43b4-ac1a-59f185510131]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.644 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.647 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.663 186853 DEBUG nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.664 186853 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.664 186853 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.664 186853 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.665 186853 DEBUG nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.665 186853 WARNING nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state resized and task_state None.#033[00m
Nov 22 02:54:35 np0005531887 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Nov 22 02:54:35 np0005531887 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Consumed 16.133s CPU time.
Nov 22 02:54:35 np0005531887 systemd-machined[153180]: Machine qemu-21-instance-0000003d terminated.
Nov 22 02:54:35 np0005531887 podman[221696]: 2025-11-22 07:54:35.716124389 +0000 UTC m=+0.085966354 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Nov 22 02:54:35 np0005531887 podman[221701]: 2025-11-22 07:54:35.736873186 +0000 UTC m=+0.101124715 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:54:35 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [NOTICE]   (221265) : haproxy version is 2.8.14-c23fe91
Nov 22 02:54:35 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [NOTICE]   (221265) : path to executable is /usr/sbin/haproxy
Nov 22 02:54:35 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [WARNING]  (221265) : Exiting Master process...
Nov 22 02:54:35 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [ALERT]    (221265) : Current worker (221267) exited with code 143 (Terminated)
Nov 22 02:54:35 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[221261]: [WARNING]  (221265) : All workers exited. Exiting... (0)
Nov 22 02:54:35 np0005531887 systemd[1]: libpod-bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73.scope: Deactivated successfully.
Nov 22 02:54:35 np0005531887 conmon[221261]: conmon bf727e3af2c59745f5fb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73.scope/container/memory.events
Nov 22 02:54:35 np0005531887 podman[221762]: 2025-11-22 07:54:35.798915294 +0000 UTC m=+0.049081622 container died bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 02:54:35 np0005531887 kernel: tap4f77ff1b-e1: entered promiscuous mode
Nov 22 02:54:35 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73-userdata-shm.mount: Deactivated successfully.
Nov 22 02:54:35 np0005531887 systemd-udevd[221621]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.854 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 NetworkManager[55210]: <info>  [1763798075.8561] manager: (tap4f77ff1b-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 22 02:54:35 np0005531887 systemd[1]: var-lib-containers-storage-overlay-c06c55662e417023b847692bddd6168e78bfe2036e3ebf3b5c7ffd8625258679-merged.mount: Deactivated successfully.
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00136|binding|INFO|Claiming lport 4f77ff1b-e147-4c07-9d9b-feabd33edead for this chassis.
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00137|binding|INFO|4f77ff1b-e147-4c07-9d9b-feabd33edead: Claiming fa:16:3e:c6:b1:2a 10.100.0.8
Nov 22 02:54:35 np0005531887 kernel: tap4f77ff1b-e1 (unregistering): left promiscuous mode
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: hostname: compute-1
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00138|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead ovn-installed in OVS
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00139|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead up in Southbound
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.881 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.882 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00140|binding|INFO|Releasing lport 4f77ff1b-e147-4c07-9d9b-feabd33edead from this chassis (sb_readonly=1)
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00141|binding|INFO|Removing iface tap4f77ff1b-e1 ovn-installed in OVS
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00142|if_status|INFO|Not setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead down as sb is readonly
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 podman[221762]: 2025-11-22 07:54:35.888710341 +0000 UTC m=+0.138876649 container cleanup bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 systemd[1]: libpod-conmon-bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73.scope: Deactivated successfully.
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00143|binding|INFO|Releasing lport 4f77ff1b-e147-4c07-9d9b-feabd33edead from this chassis (sb_readonly=0)
Nov 22 02:54:35 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:35Z|00144|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead down in Southbound
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.902 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.907 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 virtnodedevd[186714]: ethtool ioctl error on tap4f77ff1b-e1: No such device
Nov 22 02:54:35 np0005531887 podman[221808]: 2025-11-22 07:54:35.973216169 +0000 UTC m=+0.052095126 container remove bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.979 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b14209f7-fc93-4a35-b8ca-fd95723c8ce6]: (4, ('Sat Nov 22 07:54:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73)\nbf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73\nSat Nov 22 07:54:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (bf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73)\nbf727e3af2c59745f5fbb42f9de6211ef8c23d4f25ac67859bc9f4740d167e73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.982 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[749a2301-f2d6-4f46-bded-1403ee026370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:35.983 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:35 np0005531887 nova_compute[186849]: 2025-11-22 07:54:35.985 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:35 np0005531887 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.005 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ae47b2ba-8475-4185-82d4-f0d5c81e7c42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.020 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95becbab-9a6b-42a2-927f-b0bcc9b67c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.022 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9b738b9d-6d9f-4a2a-a9df-25a0b3cefa3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.037 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[396b2b5a-2913-4465-95fd-6cdba8d38e49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479482, 'reachable_time': 43703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221836, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.042 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.042 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[84ebb24d-83c4-4291-9401-9ea9ec281e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.043 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.044 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.045 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6a91fee7-a813-43dc-8281-0ca47ff6d3f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.046 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.047 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:36.047 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[03db21dc-45fa-4f71-bac6-d817c90b20f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.408 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.414 186853 INFO nova.virt.libvirt.driver [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance destroyed successfully.#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.419 186853 INFO nova.virt.libvirt.driver [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance destroyed successfully.#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.420 186853 DEBUG nova.virt.libvirt.vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-memb
er'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:21Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.421 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.421 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.422 186853 DEBUG os_vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.424 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f77ff1b-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.431 186853 INFO os_vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1')#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.432 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Deleting instance files /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63_del#033[00m
Nov 22 02:54:36 np0005531887 nova_compute[186849]: 2025-11-22 07:54:36.432 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Deletion of /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63_del complete#033[00m
Nov 22 02:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:36.665 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '669c1c7b-c493-4f31-83dd-737239095b63', 'name': 'tempest-ServerDiskConfigTestJSON-server-519804368', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '063bf16c91af408ca075c690797e09d8', 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'hostId': 'a10f2ed0b6aa467d450654d72aff6dd20902a72cd876ebec605abd3e', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:36.670 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a11a3876e3b0d6d40540fef270f2527f376e5c95223cda6a3c2983f6e783b7d9" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:54:36 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:54:36 np0005531887 systemd[221320]: Activating special unit Exit the Session...
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped target Main User Target.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped target Basic System.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped target Paths.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped target Sockets.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped target Timers.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:54:36 np0005531887 systemd[221320]: Closed D-Bus User Message Bus Socket.
Nov 22 02:54:36 np0005531887 systemd[221320]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:54:36 np0005531887 systemd[221320]: Removed slice User Application Slice.
Nov 22 02:54:36 np0005531887 systemd[221320]: Reached target Shutdown.
Nov 22 02:54:36 np0005531887 systemd[221320]: Finished Exit the Session.
Nov 22 02:54:36 np0005531887 systemd[221320]: Reached target Exit the Session.
Nov 22 02:54:36 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:54:36 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:54:36 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:54:36 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:54:36 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:54:36 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:54:36 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.130 186853 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updated VIF entry in instance network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.130 186853 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.158 186853 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.256 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.256 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating image(s)#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.257 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.257 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.258 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.258 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.258 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.294 186853 DEBUG nova.compute.manager [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.295 186853 DEBUG oslo_concurrency.lockutils [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.295 186853 DEBUG oslo_concurrency.lockutils [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.295 186853 DEBUG oslo_concurrency.lockutils [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.296 186853 DEBUG nova.compute.manager [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:37 np0005531887 nova_compute[186849]: 2025-11-22 07:54:37.296 186853 WARNING nova.compute.manager [req-66fcd351-69be-4c82-ac78-977ce9bdd553 req-85f7a1a4-c3c5-44ed-86ac-ff90e5098c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.305 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 22 Nov 2025 07:54:36 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-6fbb316c-5f9f-4ed5-950d-d3282187db0c x-openstack-request-id: req-6fbb316c-5f9f-4ed5-950d-d3282187db0c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.307 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}, {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.307 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-6fbb316c-5f9f-4ed5-950d-d3282187db0c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.309 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}a11a3876e3b0d6d40540fef270f2527f376e5c95223cda6a3c2983f6e783b7d9" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 02:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:37.325 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:37.325 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:37.326 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.446 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Sat, 22 Nov 2025 07:54:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cebae813-9275-49eb-a800-1cbdea2e7f13 x-openstack-request-id: req-cebae813-9275-49eb-a800-1cbdea2e7f13 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.447 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.447 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7 used request id req-cebae813-9275-49eb-a800-1cbdea2e7f13 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.448 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '10a29489-706f-428f-b645-1c688d642f0b', 'name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'hostId': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.452 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'name': 'tempest-DeleteServersTestJSON-server-1711924286', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'hostId': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>]
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.454 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.454 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.454 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>]
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.456 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.459 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 10a29489-706f-428f-b645-1c688d642f0b / tapc27f5a73-ae inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.459 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.462 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c64e78b6-87b2-425c-aef9-771bcd042d58 / tapa038edb6-47 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.462 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a00cccff-3801-471d-a3f1-7de002d4dbed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.455169', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e214c96-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'c84bf11594a02ec26049cafe9a7e587a87a24ddebb1f48893c4d75a46642825a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.455169', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e21be88-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '938ee384585f256f518d266a0bcd36f1cb0c6d703bb8436ac86b65813a332ee7'}]}, 'timestamp': '2025-11-22 07:54:37.463102', '_unique_id': 'bf3e0742bf9c44d8ba5f33db239e2096'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.464 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.467 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.468 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.468 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.469 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3449f2f2-f21c-43c5-b916-0b37eaaf7baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.467743', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e22b554-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'e93e09b8c73a8776fc63c1cc59200b5ac22d0cf0ffa702b50094635c2461ad3a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.467743', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e22c6ac-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': 'b6c79954f314e1ba954741163e9d36e4a914dfceb22be2008817c918ea486be0'}]}, 'timestamp': '2025-11-22 07:54:37.469809', '_unique_id': 'a67fbcf0db464241b7504623938e17f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.470 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.474 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.475 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.487 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.488 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.501 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.502 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49df1fb3-15d2-4b67-8eb5-21e1b1e655a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.474947', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e259148-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': '1bcdca9a6b59c3472c95965b03689da1c8f874d14e4313e1744bb6e4768f2c76'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 
'10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.474947', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e25a64c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': 'a26be4e4d6d9e08b936c02e71475570eb5858113db310e134eaf5a8df61dac2f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.474947', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e27c350-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': '2ea5b78e9dac31bb9b549796df1b8ef314212cd29317972e3d4a269b7b76bec6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.474947', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e27d354-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': '86407e61af05fb07d370a088b708df8cded42386643e6ba480d649d16fded099'}]}, 'timestamp': '2025-11-22 07:54:37.502914', '_unique_id': '42a87a56f8784d68903ce90de1565a5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.504 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.509 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.509 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.510 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02a8300f-1fb2-4541-af08-0addd57d8c6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.508893', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e28f4dc-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'abb5b8d9c4e26e5fe9688eeb6c1e022e830f7d8d3eee4045c29e9bb93e4b3be7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.508893', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e290a62-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '6e29acc1dd8a17d4155fc372a199ed0005322fb23b0be00ea6cb1c9e3fca6cc6'}]}, 'timestamp': '2025-11-22 07:54:37.510842', '_unique_id': 'c67a0d57f2b8462e8efa108968ab7e45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.515 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.518 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.518 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.519 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1f35396-40c2-4fa2-9bde-5197df677033', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.516071', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e2a4936-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'a8a89dbdf2a04b61bb55581d064aeae844d21af9af626ba4dce24939031a5a89'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.516071', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e2a5ce6-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '52f3703b935e9ab829bc06aacd89102ad0d18cb910256167395991a42640192c'}]}, 'timestamp': '2025-11-22 07:54:37.519556', '_unique_id': '9187cbcb9a0f4433bca6a72bab24d766'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.526 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.531 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.532 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.551 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.551 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 10a29489-706f-428f-b645-1c688d642f0b: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.567 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.568 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c64e78b6-87b2-425c-aef9-771bcd042d58: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.568 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.569 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.569 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.570 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5fe81d4-1543-4e40-a6a9-68fa90a2d393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.568704', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e32112a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': '25db243766705051bbb3926069b2a2ad115010ffc650f3ebad0bf6495316972a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.568704', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e322174-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': 'bc89c6a74f4a39b16a793e633dd947b10099f43f7dfa9e87b4e21722666b78fc'}]}, 'timestamp': '2025-11-22 07:54:37.570415', '_unique_id': '2382ad6fc4bf4344b61853411393fba5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.571 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.574 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.576 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.602 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.603 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.647 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.648 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96333f51-262e-42bd-b851-2ff58bc7c6b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.575123', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e3717b0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': 'ec30c34252274ec9e54d83f876b7bae92d1f9a38fc3ae211cf67bb119a844466'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.575123', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e372c96-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': 'fd8ac96122b08867e5282f99bd234a814602310dacacddbdb023d156a4586e60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.575123', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e3e0282-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': 'cb73caf0bbc9828c41deade8ae7136819e36874f9f1e6f168845eb52ccad5585'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.575123', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e3e17fe-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '8ea8c6a12b7d872576715854a2b1a038cf5d48600997e6a28c78cbc6f039de6e'}]}, 'timestamp': '2025-11-22 07:54:37.648861', '_unique_id': 'abdb39d37b5d46db89f594cee6eeaf02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.654 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.655 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.655 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/cpu volume: 6050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.656 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/cpu volume: 3220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b55dbb88-4609-4f95-9502-c9be865075fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6050000000, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b', 'timestamp': '2025-11-22T07:54:37.654489', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7e3f3af8-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.220669039, 'message_signature': '3cbf43ad702489decb07edeb62fd08ee138a32ea1f292831120d66157e63efd7'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3220000000, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'timestamp': '2025-11-22T07:54:37.654489', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7e3f4d0e-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.237218283, 'message_signature': '16efc05abc98d0a99ae865e11441796436be5437ceb700361a57a4dbc8775851'}]}, 'timestamp': '2025-11-22 07:54:37.656755', '_unique_id': '4eb246a218e4454ba50b13821454c24f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.662 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.662 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.663 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.663 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.664 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fd60359-6b9e-433c-bc82-8001b31debb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.661703', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e404966-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': 'a50d7ac902827d7c7ad93d08bdd83f6adda39e39f33dd672ce29b59882f59fcb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.661703', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e405aa0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '731d99619e7a8bf1c3dd4d49d33b88ca189bfc83be1f7f3792c9a8688443ea5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.661703', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e406b94-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '42d03e368976aaf62ce15657b0545e070eb2056279a2efbd6057751b458e7384'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.661703', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e407bc0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '8738ee5efd9e5a7c1d9f4e27e64d6779e18f34741edce7d0c6140105c624f63a'}]}, 'timestamp': '2025-11-22 07:54:37.664495', '_unique_id': '5556be311f9942b7b4ffb83780ffd452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.670 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.670 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.670 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.671 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.671 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ec2fbab-19aa-44f9-a460-6d734d972992', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.669525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e417944-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '04d707dcab711c6173243e20d3f29548b5d2f75bab689898496b604469f37d27'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': 
None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.669525', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e418934-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '6ca3c509175e863590cf23cba1b4b2b2beb68dbc774c85311d12fd63031f405c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.669525', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e4199d8-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '85a69d82d9f99a75e024d9c67f29a84dbdc983e4d11ac3ae21a2b1d476c1b61e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.669525', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e41ab58-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': 'c5a26b7b2a8c9e5e1a374b9cb5f65ec4b6777e24507b8d6d002fce56ccf5819d'}]}, 'timestamp': '2025-11-22 07:54:37.672284', '_unique_id': '28ba63a801d446d0bdf9c7749cfe7670'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.678 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.678 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.latency volume: 624532154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.679 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.latency volume: 2954183 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.679 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.latency volume: 701135125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.680 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.latency volume: 687356 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e93a12f-3401-4f07-9c20-f66117ed393d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 624532154, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.677662', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e42bdfe-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '7dc451c8a02081be9c033dae7e99ece6cccc3792dd673116759a02bd0a4b7c42'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2954183, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.677662', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e42ccae-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '64e4612b4d0df80b92bb65a47ebe651c3722e11d7919a9263be89410e51497ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 701135125, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.677662', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e42db54-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': 'fada28efd7d7e1d6a4d6dfec1b13cd95ac74f97338f32e6d96815dc8ce0cc33b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 687356, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.677662', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e42e8ba-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': 'f0a64bff38117efc35a7dc85e3cbae8c7f536c553a9ac78ca22f5b2e9dd9de07'}]}, 'timestamp': '2025-11-22 07:54:37.680399', '_unique_id': '3732c25bfef141438987082db2b3077e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.685 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.685 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.686 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.686 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.686 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74a77365-2f0d-4602-ade5-b0b38e097d43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.684793', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e43ccb2-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '8818c99f6d7a1ac53b1e96f27b3cdf0bca5d8b4462a33f33efb5bec6cc918bfc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.684793', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e43da54-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '2e52a9510f48218a7b8814f329b6cc663e9910ec47ca7008e96cd65be243cfb1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.684793', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e43e698-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '07f84de8d29b9ff8e82be487e8bd9f96f974464779c76a4a8539d76222ccc911'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.684793', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e43f30e-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': 'ae958f6c70e9ac0e77f5b15f9d8fa5e8d1bb32ca6923c3a3515e7146a61ed045'}]}, 'timestamp': '2025-11-22 07:54:37.687174', '_unique_id': '269e3e3984344c64868e9e4047df43f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.691 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.691 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.691 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>]
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.691 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.692 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.693 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.693 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27361f20-302b-4788-bc06-43ca99a45023', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.691974', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e44e796-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'df94b7cc1b8246c8f26bb8b711a21bdd078595dd9539eff5af3a064d5c446017'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.691974', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e44fc40-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '87dab2ae59697240c8cfa1081ec4a656f960e32f45663f8d0d00ba0531dc5cad'}]}, 'timestamp': '2025-11-22 07:54:37.694012', '_unique_id': 'c99588b72d77418d9c413e6f24d51f6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.698 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.698 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.698 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4da08c9-2dac-400f-8d97-556f3b55f38a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.697369', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e45b1b2-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': '2028b03986bfd8c8acc380f0b202e612a4c966f38e67f2767f917524186c8ee1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.697369', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e45be5a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': 'c4f50173186aeac9b31fffba26dc4a5490bb36e7d064e5565da77db8b3a14721'}]}, 'timestamp': '2025-11-22 07:54:37.698992', '_unique_id': '6c7f911bd8fa4d759655c649cdeb18d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.700 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.702 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.702 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.702 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37af3612-af2b-434f-bd86-b5565b138ae7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.700746', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e464794-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': '424484c51aa0ca6badf2658c78eed1ea1c91f3482f39548f0132a2e9675e0db3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.700746', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e46504a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '1086292a41cef66bcac8842137e6b404be0e22393dfb7216f4d91ab440d5dc1a'}]}, 'timestamp': '2025-11-22 07:54:37.702664', '_unique_id': 'db94ceaac2cb4f9cac014a4e1f41ad09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.704 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.704 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.704 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.704 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf2b371c-17e1-4f2a-b3b4-efe448fecee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.703854', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e469dac-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': '34d9e317709368f0aed788776252e07ec27f9661d7d57c89d087a55431282955'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.703854', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e46a784-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': 'b3d2c0951686899a17c591484f794cd12960baffd2b7e335338824bab8f7cd30'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.703854', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e46af36-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': '36a62b78508100508c8d040dd69d0517e1045dbb20db1d4ffe137a2cba85da4b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.703854', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e46b6ac-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': '88b85276f00343e57683887485f1ce13f535f6260d82a38193b72cd8b7f56287'}]}, 'timestamp': '2025-11-22 07:54:37.705288', '_unique_id': 'afe12f0305454e348346da26bae77bd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.706 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.706 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.707 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.707 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db5390f0-eb4b-44b8-a11f-40aab7c08dfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.706439', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e470382-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': '44360811a3499ad2060f3ff4fde398e697345d553459d43bc19e94c87a0466d7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.706439', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e470d0a-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': 'ee4ce3041f52e5d060c31d215e06f99f44a953e191ef506a5ef872f3ab3f3655'}]}, 'timestamp': '2025-11-22 07:54:37.707495', '_unique_id': '0fcf7f549d7742adbebf3776953dfed5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-519804368>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-688404127>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>]
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.708 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.710 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.710 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.710 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.710 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.allocation volume: 29954048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.710 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fcd7ef8-c8b2-423f-ac9a-5fd3551d8bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.709014', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e47808c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': 'a0b4a48c122467f09a4f520c739f3772039d2fd87887e2a2536cdbe6d6e1bf11'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.709014', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e4788f2-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.145670544, 'message_signature': '0d85efea79d202686427b1e791ebc402f8e4007d55b7845e861137c18bf225da'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29954048, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.709014', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e479144-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': 'd439a7a2bc74588720419e73e633453bc23390b3cdc6a12ea5a318175d6cf7a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.709014', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e479ae0-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.158459837, 'message_signature': '3eb66f6fd34711fb85e21ad5e6f95298eeeb52c9c0888143e5d1bb4936f3191a'}]}, 'timestamp': '2025-11-22 07:54:37.711121', '_unique_id': 'c51ca90d51944481b358f23678d2f2ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.712 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.712 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81f6f177-7238-4ca2-8a51-7e3e061588be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-00000040-10a29489-706f-428f-b645-1c688d642f0b-tapc27f5a73-ae', 'timestamp': '2025-11-22T07:54:37.712306', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'tapc27f5a73-ae', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:d5:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc27f5a73-ae'}, 'message_id': '7e47e784-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.126626007, 'message_signature': 'f7fe1c23285e284cdf0975b219e2dcbc2e207063df37ab953a1e45643fea95d9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'instance-0000003b-c64e78b6-87b2-425c-aef9-771bcd042d58-tapa038edb6-47', 'timestamp': '2025-11-22T07:54:37.712306', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'tapa038edb6-47', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:ab:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa038edb6-47'}, 'message_id': '7e47effe-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.129949499, 'message_signature': '620806ca47f1bb38d43b8ed8bc78777ab2325be98ed86afe73cbc51210dbaadf'}]}, 'timestamp': '2025-11-22 07:54:37.713329', '_unique_id': '76890a4e09ca45cab4086ae90ec0e0bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63'
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.714 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003d, id=669c1c7b-c493-4f31-83dd-737239095b63>: [Error Code 42] Domain not found: no domain with matching uuid '669c1c7b-c493-4f31-83dd-737239095b63' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.714 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.715 12 DEBUG ceilometer.compute.pollsters [-] 10a29489-706f-428f-b645-1c688d642f0b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.715 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.715 12 DEBUG ceilometer.compute.pollsters [-] c64e78b6-87b2-425c-aef9-771bcd042d58/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1adbff50-a955-4137-8688-c0d01f0af72a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-vda', 'timestamp': '2025-11-22T07:54:37.714463', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e483748-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '69ff3ae7ce3c803e1eecbee39c3349005414f56659449dcf0327042a2622ce54'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 
'project_name': None, 'resource_id': '10a29489-706f-428f-b645-1c688d642f0b-sda', 'timestamp': '2025-11-22T07:54:37.714463', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-688404127', 'name': 'instance-00000040', 'instance_id': '10a29489-706f-428f-b645-1c688d642f0b', 'instance_type': 'm1.micro', 'host': 'fecdd2f490be1cc81803448995491b37056c7061fb937a5a9410db18', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e484102-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.245960298, 'message_signature': '8236317afea58843f46255a5ef7e52270a3cd92e6e478a5cd407584f94365499'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-vda', 'timestamp': '2025-11-22T07:54:37.714463', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7e48497c-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '4ac4d8b5b1e34be03b129b04338300405fdf5f477e8d78e6654e8269407e330a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_name': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_name': None, 'resource_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58-sda', 'timestamp': '2025-11-22T07:54:37.714463', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1711924286', 'name': 'instance-0000003b', 'instance_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'instance_type': 'm1.micro', 'host': '41cdcff15b5a7c4ff81f1776cad17739f7e062616f0dbe910b638a21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1c351edf-5b2d-477d-93d0-c380bdae83e7', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7e485444-c778-11f0-9b25-fa163ecc0304', 'monotonic_time': 4820.273315657, 'message_signature': '4f046e4edc7a64f2785012a4aab28b8ae37c11017f032804bf686b9ce3b55cbe'}]}, 'timestamp': '2025-11-22 07:54:37.715904', '_unique_id': '4cb451343a6a460a8244ffd75c203252'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:54:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:54:37.716 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.345 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.420 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.421 186853 DEBUG nova.virt.images [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] 360f90ca-2ddb-4e60-a48e-364e3b48bd96 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.423 186853 DEBUG nova.privsep.utils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.423 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.803 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted" returned: 0 in 0.379s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.807 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.827 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.828 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.829 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.829 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.829 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.829 186853 WARNING nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.830 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.830 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.830 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.831 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.831 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.831 186853 WARNING nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.831 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.832 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.832 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.832 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.832 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.832 186853 WARNING nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.833 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.833 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.833 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.833 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.834 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.834 186853 WARNING nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.834 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.834 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.834 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.835 186853 DEBUG oslo_concurrency.lockutils [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.835 186853 DEBUG nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.835 186853 WARNING nova.compute.manager [req-13564813-d894-49d1-8261-29aa4b77dc5f req-df81d7d1-d16b-4225-8e13-232c87d69386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.870 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.871 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.891 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.944 186853 DEBUG nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.944 186853 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.947 186853 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.948 186853 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.948 186853 DEBUG nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.948 186853 WARNING nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state resized and task_state deleting.#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.957 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.957 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.958 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:39 np0005531887 nova_compute[186849]: 2025-11-22 07:54:39.974 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.034 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.035 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.078 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.079 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.080 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.147 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.148 186853 DEBUG nova.virt.disk.api [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.149 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.211 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.213 186853 DEBUG nova.virt.disk.api [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.213 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.214 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Ensure instance console log exists: /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.214 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.215 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.215 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.217 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start _get_guest_xml network_info=[{"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.221 186853 WARNING nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.228 186853 DEBUG nova.virt.libvirt.host [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.230 186853 DEBUG nova.virt.libvirt.host [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.237 186853 DEBUG nova.virt.libvirt.host [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.238 186853 DEBUG nova.virt.libvirt.host [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.240 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.241 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.241 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.242 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.242 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.242 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.242 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.243 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.243 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.243 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.244 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.244 186853 DEBUG nova.virt.hardware [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.244 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.264 186853 DEBUG nova.virt.libvirt.vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:37Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.266 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.267 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.270 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <uuid>669c1c7b-c493-4f31-83dd-737239095b63</uuid>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <name>instance-0000003d</name>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-519804368</nova:name>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:54:40</nova:creationTime>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        <nova:port uuid="4f77ff1b-e147-4c07-9d9b-feabd33edead">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="serial">669c1c7b-c493-4f31-83dd-737239095b63</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="uuid">669c1c7b-c493-4f31-83dd-737239095b63</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:c6:b1:2a"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <target dev="tap4f77ff1b-e1"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/console.log" append="off"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:54:40 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:54:40 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:54:40 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:54:40 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.276 186853 DEBUG nova.compute.manager [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Preparing to wait for external event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.277 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.277 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.277 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.278 186853 DEBUG nova.virt.libvirt.vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:37Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.278 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.279 186853 DEBUG nova.network.os_vif_util [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.279 186853 DEBUG os_vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.281 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.282 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.283 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.286 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.286 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f77ff1b-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.286 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f77ff1b-e1, col_values=(('external_ids', {'iface-id': '4f77ff1b-e147-4c07-9d9b-feabd33edead', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:b1:2a', 'vm-uuid': '669c1c7b-c493-4f31-83dd-737239095b63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.288 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 NetworkManager[55210]: <info>  [1763798080.2895] manager: (tap4f77ff1b-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.291 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.297 186853 INFO os_vif [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1')#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.365 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.366 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.367 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:c6:b1:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.367 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Using config drive#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.382 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.418 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'keypairs' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.647 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.674 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.675 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.675 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.675 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.676 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.684 186853 INFO nova.compute.manager [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Terminating instance#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.691 186853 DEBUG nova.compute.manager [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:54:40 np0005531887 kernel: tapa038edb6-47 (unregistering): left promiscuous mode
Nov 22 02:54:40 np0005531887 NetworkManager[55210]: <info>  [1763798080.7250] device (tapa038edb6-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:54:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:40Z|00145|binding|INFO|Releasing lport a038edb6-47af-4f7e-9f5e-715660b6da32 from this chassis (sb_readonly=0)
Nov 22 02:54:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:40Z|00146|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 down in Southbound
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.735 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:40Z|00147|binding|INFO|Removing iface tapa038edb6-47 ovn-installed in OVS
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.749 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.752 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:40.751 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:ab:fc 10.100.0.7'], port_security=['fa:16:3e:36:ab:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a038edb6-47af-4f7e-9f5e-715660b6da32) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:40.752 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a038edb6-47af-4f7e-9f5e-715660b6da32 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis#033[00m
Nov 22 02:54:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:40.754 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:40.754 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa60c08-f582-4220-9d31-4e644d157f43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:40.755 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore#033[00m
Nov 22 02:54:40 np0005531887 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Nov 22 02:54:40 np0005531887 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000003b.scope: Consumed 6.969s CPU time.
Nov 22 02:54:40 np0005531887 systemd-machined[153180]: Machine qemu-23-instance-0000003b terminated.
Nov 22 02:54:40 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [NOTICE]   (221684) : haproxy version is 2.8.14-c23fe91
Nov 22 02:54:40 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [NOTICE]   (221684) : path to executable is /usr/sbin/haproxy
Nov 22 02:54:40 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [WARNING]  (221684) : Exiting Master process...
Nov 22 02:54:40 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [ALERT]    (221684) : Current worker (221686) exited with code 143 (Terminated)
Nov 22 02:54:40 np0005531887 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[221680]: [WARNING]  (221684) : All workers exited. Exiting... (0)
Nov 22 02:54:40 np0005531887 systemd[1]: libpod-f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf.scope: Deactivated successfully.
Nov 22 02:54:40 np0005531887 podman[221896]: 2025-11-22 07:54:40.901869661 +0000 UTC m=+0.056573515 container died f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.926 186853 INFO nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Creating config drive at /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.936 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg1s1mzeo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.955 186853 INFO nova.virt.libvirt.driver [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance destroyed successfully.#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.956 186853 DEBUG nova.objects.instance [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'resources' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.977 186853 DEBUG nova.virt.libvirt.vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:34Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.979 186853 DEBUG nova.network.os_vif_util [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.980 186853 DEBUG nova.network.os_vif_util [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.981 186853 DEBUG os_vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.984 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.985 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa038edb6-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.990 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.995 186853 INFO os_vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47')#033[00m
Nov 22 02:54:40 np0005531887 nova_compute[186849]: 2025-11-22 07:54:40.996 186853 INFO nova.virt.libvirt.driver [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Deleting instance files /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_del#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.004 186853 INFO nova.virt.libvirt.driver [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Deletion of /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_del complete#033[00m
Nov 22 02:54:41 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf-userdata-shm.mount: Deactivated successfully.
Nov 22 02:54:41 np0005531887 systemd[1]: var-lib-containers-storage-overlay-5f4cc6e409bad27ed3371f6eae91caca30201c042bb128ba6a8f0afbb86589d3-merged.mount: Deactivated successfully.
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.060 186853 DEBUG oslo_concurrency.processutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg1s1mzeo" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.124 186853 INFO nova.compute.manager [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.125 186853 DEBUG oslo.service.loopingcall [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:54:41 np0005531887 podman[221896]: 2025-11-22 07:54:41.126058867 +0000 UTC m=+0.280762721 container cleanup f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.126 186853 DEBUG nova.compute.manager [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.126 186853 DEBUG nova.network.neutron [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:54:41 np0005531887 systemd[1]: libpod-conmon-f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf.scope: Deactivated successfully.
Nov 22 02:54:41 np0005531887 kernel: tap4f77ff1b-e1: entered promiscuous mode
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.1467] manager: (tap4f77ff1b-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Nov 22 02:54:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:41Z|00148|binding|INFO|Claiming lport 4f77ff1b-e147-4c07-9d9b-feabd33edead for this chassis.
Nov 22 02:54:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:41Z|00149|binding|INFO|4f77ff1b-e147-4c07-9d9b-feabd33edead: Claiming fa:16:3e:c6:b1:2a 10.100.0.8
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 systemd-udevd[221874]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.1628] device (tap4f77ff1b-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.1643] device (tap4f77ff1b-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:54:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:41Z|00150|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead ovn-installed in OVS
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.174 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 systemd-machined[153180]: New machine qemu-24-instance-0000003d.
Nov 22 02:54:41 np0005531887 systemd[1]: Started Virtual Machine qemu-24-instance-0000003d.
Nov 22 02:54:41 np0005531887 podman[221957]: 2025-11-22 07:54:41.261745037 +0000 UTC m=+0.111530571 container remove f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.267 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[48f63a8f-034f-4f39-b947-cea8a720f6cf]: (4, ('Sat Nov 22 07:54:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf)\nf1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf\nSat Nov 22 07:54:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (f1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf)\nf1fc8018c136caa19f628ccbc65722d14e41dbb93289349b0680dff2d25ecbaf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.269 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4546deb0-266d-454d-94a5-c34586ff3235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.270 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.272 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.288 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[adb74bee-4fa2-47c8-922b-093411ee90fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.302 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.304 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8d310742-92b5-49de-8c2e-95196e56bec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:41Z|00151|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead up in Southbound
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.306 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[224dbac5-911b-4865-8055-4301b0caac78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.322 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[80daf5ab-df80-46c1-a598-11bd46e28552]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481643, 'reachable_time': 31209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221983, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.325 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.325 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[3b07235b-ae54-4080-8b49-580072bfa63e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.326 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis#033[00m
Nov 22 02:54:41 np0005531887 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.328 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.337 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[24e47dd7-8b09-4feb-9c9c-f93054c7995e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.338 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.340 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.340 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b466ea75-fdc2-463e-b492-a0a5fe20e4f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.340 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c39b19ff-d2ec-46f7-a6de-b6f1c4a3c775]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.352 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[6725d95b-bec0-4fd6-809e-083ae94b17de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.374 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b84aabcc-beef-438b-9ee7-774aaf6179d1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.411 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2f33b3-dc3b-4fa0-99f3-0783b76cc797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.416 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73f4d840-1f42-4858-ab06-4a5041f8354b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.4190] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.457 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d8130688-f3df-4afc-91a8-5eb2db76bec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.460 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4089bb7d-3d61-4886-8a68-6ff7c67a10f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.4874] device (tapd54e232a-50): carrier: link connected
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.495 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ad516ee8-43df-4ff8-81cb-6d29374161f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.518 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5de439-a8ba-4d73-809c-0ff8fd29f077]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482409, 'reachable_time': 31183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222007, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.541 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1ddaf3-6d82-4fa5-8f61-7cf234ddca25]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482409, 'tstamp': 482409}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222008, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.561 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2eab5788-fd35-40be-beb7-46159ab235fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482409, 'reachable_time': 31183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222009, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.601 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a9482131-5ba4-4dd4-b331-e0c226794853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.666 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[116688ca-0474-4889-b8d0-ba4fd36bb9a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.668 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.669 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.669 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:41 np0005531887 NetworkManager[55210]: <info>  [1763798081.6720] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 22 02:54:41 np0005531887 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.672 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.674 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:41Z|00152|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.686 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.688 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.689 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ef6e92-ba97-468a-abe5-958c427b4cce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.690 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:54:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:41.690 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.790 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.791 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.800 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 669c1c7b-c493-4f31-83dd-737239095b63 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.800 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798081.799905, 669c1c7b-c493-4f31-83dd-737239095b63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.800 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Started (Lifecycle Event)#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.822 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.828 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798081.8007252, 669c1c7b-c493-4f31-83dd-737239095b63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.829 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.847 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.852 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:41 np0005531887 nova_compute[186849]: 2025-11-22 07:54:41.884 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:54:42 np0005531887 podman[222049]: 2025-11-22 07:54:42.03882215 +0000 UTC m=+0.025823414 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:54:42 np0005531887 podman[222049]: 2025-11-22 07:54:42.151213399 +0000 UTC m=+0.138214633 container create 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:42 np0005531887 systemd[1]: Started libpod-conmon-01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4.scope.
Nov 22 02:54:42 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:54:42 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb714a96c812a33c19fad9bec677a6ea9007f3ab2e47797405d69273e9174326/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:54:42 np0005531887 podman[222049]: 2025-11-22 07:54:42.260464883 +0000 UTC m=+0.247466127 container init 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:54:42 np0005531887 podman[222049]: 2025-11-22 07:54:42.267530215 +0000 UTC m=+0.254531459 container start 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:54:42 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [NOTICE]   (222081) : New worker (222083) forked
Nov 22 02:54:42 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [NOTICE]   (222081) : Loading success.
Nov 22 02:54:42 np0005531887 podman[222064]: 2025-11-22 07:54:42.319414845 +0000 UTC m=+0.091737865 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.432 186853 DEBUG nova.compute.manager [req-baa9fde4-49ac-41fc-8282-3f53145ea351 req-c8dd17a0-4d7e-4230-b7a1-c123fbd5e79a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.433 186853 DEBUG oslo_concurrency.lockutils [req-baa9fde4-49ac-41fc-8282-3f53145ea351 req-c8dd17a0-4d7e-4230-b7a1-c123fbd5e79a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.435 186853 DEBUG oslo_concurrency.lockutils [req-baa9fde4-49ac-41fc-8282-3f53145ea351 req-c8dd17a0-4d7e-4230-b7a1-c123fbd5e79a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.435 186853 DEBUG oslo_concurrency.lockutils [req-baa9fde4-49ac-41fc-8282-3f53145ea351 req-c8dd17a0-4d7e-4230-b7a1-c123fbd5e79a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.435 186853 DEBUG nova.compute.manager [req-baa9fde4-49ac-41fc-8282-3f53145ea351 req-c8dd17a0-4d7e-4230-b7a1-c123fbd5e79a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Processing event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.436 186853 DEBUG nova.compute.manager [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.441 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798082.4409542, 669c1c7b-c493-4f31-83dd-737239095b63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.441 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.443 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.447 186853 INFO nova.virt.libvirt.driver [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance spawned successfully.#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.447 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.460 186853 DEBUG nova.compute.manager [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.461 186853 DEBUG oslo_concurrency.lockutils [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.461 186853 DEBUG oslo_concurrency.lockutils [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.461 186853 DEBUG oslo_concurrency.lockutils [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.461 186853 DEBUG nova.compute.manager [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.462 186853 WARNING nova.compute.manager [req-f98ff1e5-d4fa-44d6-bcc5-33f3f0d35329 req-65ba2bf6-e8f9-4e04-b5e5-77e55b1814c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.477 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.477 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.477 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.478 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.478 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.478 186853 DEBUG nova.virt.libvirt.driver [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.483 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.485 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.515 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.658 186853 DEBUG nova.compute.manager [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.767 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.769 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.769 186853 DEBUG nova.objects.instance [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 22 02:54:42 np0005531887 nova_compute[186849]: 2025-11-22 07:54:42.863 186853 DEBUG oslo_concurrency.lockutils [None req-76f2434b-de66-4928-86fe-fc863639500b e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.585 186853 DEBUG nova.network.neutron [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.621 186853 INFO nova.compute.manager [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Took 2.50 seconds to deallocate network for instance.#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.742 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.742 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.748 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.801 186853 INFO nova.scheduler.client.report [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocations for instance c64e78b6-87b2-425c-aef9-771bcd042d58#033[00m
Nov 22 02:54:43 np0005531887 nova_compute[186849]: 2025-11-22 07:54:43.931 186853 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.613 186853 DEBUG nova.compute.manager [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.614 186853 DEBUG oslo_concurrency.lockutils [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.614 186853 DEBUG oslo_concurrency.lockutils [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.614 186853 DEBUG oslo_concurrency.lockutils [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.614 186853 DEBUG nova.compute.manager [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.615 186853 WARNING nova.compute.manager [req-8d431c58-f9b7-4012-a6f0-82682b7d862b req-77d00641-802a-4af4-9961-bc1b3d695327 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.622 186853 DEBUG nova.compute.manager [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.623 186853 DEBUG oslo_concurrency.lockutils [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.623 186853 DEBUG oslo_concurrency.lockutils [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.624 186853 DEBUG oslo_concurrency.lockutils [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.624 186853 DEBUG nova.compute.manager [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.624 186853 WARNING nova.compute.manager [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state active and task_state None.#033[00m
Nov 22 02:54:44 np0005531887 nova_compute[186849]: 2025-11-22 07:54:44.624 186853 DEBUG nova.compute.manager [req-6d99463f-25d8-45af-8a99-0df0d0b1b15d req-7e4851e5-852d-4d93-9c0c-ec67646707c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-deleted-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:45 np0005531887 nova_compute[186849]: 2025-11-22 07:54:45.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:45 np0005531887 nova_compute[186849]: 2025-11-22 07:54:45.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:54:45 np0005531887 nova_compute[186849]: 2025-11-22 07:54:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.482 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:46 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:46Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:d5:34 10.100.0.14
Nov 22 02:54:46 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:46Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:d5:34 10.100.0.14
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.503 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.503 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.504 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.504 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.589 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.678 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.680 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.750 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.760 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.831 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.833 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:46 np0005531887 nova_compute[186849]: 2025-11-22 07:54:46.894 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.071 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.073 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5331MB free_disk=73.38602447509766GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.074 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.074 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.161 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 669c1c7b-c493-4f31-83dd-737239095b63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.162 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 10a29489-706f-428f-b645-1c688d642f0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.162 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.162 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.226 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.242 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.268 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.269 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.586 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.587 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.587 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.588 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.588 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.595 186853 INFO nova.compute.manager [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Terminating instance#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.602 186853 DEBUG nova.compute.manager [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:54:47 np0005531887 kernel: tap4f77ff1b-e1 (unregistering): left promiscuous mode
Nov 22 02:54:47 np0005531887 NetworkManager[55210]: <info>  [1763798087.6221] device (tap4f77ff1b-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:54:47 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:47Z|00153|binding|INFO|Releasing lport 4f77ff1b-e147-4c07-9d9b-feabd33edead from this chassis (sb_readonly=0)
Nov 22 02:54:47 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:47Z|00154|binding|INFO|Setting lport 4f77ff1b-e147-4c07-9d9b-feabd33edead down in Southbound
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:47Z|00155|binding|INFO|Removing iface tap4f77ff1b-e1 ovn-installed in OVS
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.646 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:b1:2a 10.100.0.8'], port_security=['fa:16:3e:c6:b1:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '669c1c7b-c493-4f31-83dd-737239095b63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4f77ff1b-e147-4c07-9d9b-feabd33edead) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.647 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4f77ff1b-e147-4c07-9d9b-feabd33edead in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.648 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.649 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2fabf294-206b-4a18-8472-f144f6098724]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.649 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore#033[00m
Nov 22 02:54:47 np0005531887 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Nov 22 02:54:47 np0005531887 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000003d.scope: Consumed 5.781s CPU time.
Nov 22 02:54:47 np0005531887 systemd-machined[153180]: Machine qemu-24-instance-0000003d terminated.
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [NOTICE]   (222081) : haproxy version is 2.8.14-c23fe91
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [NOTICE]   (222081) : path to executable is /usr/sbin/haproxy
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [WARNING]  (222081) : Exiting Master process...
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [WARNING]  (222081) : Exiting Master process...
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [ALERT]    (222081) : Current worker (222083) exited with code 143 (Terminated)
Nov 22 02:54:47 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[222065]: [WARNING]  (222081) : All workers exited. Exiting... (0)
Nov 22 02:54:47 np0005531887 systemd[1]: libpod-01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4.scope: Deactivated successfully.
Nov 22 02:54:47 np0005531887 podman[222160]: 2025-11-22 07:54:47.788185753 +0000 UTC m=+0.053389497 container died 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:54:47 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4-userdata-shm.mount: Deactivated successfully.
Nov 22 02:54:47 np0005531887 systemd[1]: var-lib-containers-storage-overlay-bb714a96c812a33c19fad9bec677a6ea9007f3ab2e47797405d69273e9174326-merged.mount: Deactivated successfully.
Nov 22 02:54:47 np0005531887 podman[222160]: 2025-11-22 07:54:47.843813225 +0000 UTC m=+0.109016969 container cleanup 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:54:47 np0005531887 systemd[1]: libpod-conmon-01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4.scope: Deactivated successfully.
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.867 186853 INFO nova.virt.libvirt.driver [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Instance destroyed successfully.#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.869 186853 DEBUG nova.objects.instance [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid 669c1c7b-c493-4f31-83dd-737239095b63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.894 186853 DEBUG nova.virt.libvirt.vif [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-519804368',display_name='tempest-ServerDiskConfigTestJSON-server-519804368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-519804368',id=61,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-pgwqi3ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:42Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=669c1c7b-c493-4f31-83dd-737239095b63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.895 186853 DEBUG nova.network.os_vif_util [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "address": "fa:16:3e:c6:b1:2a", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f77ff1b-e1", "ovs_interfaceid": "4f77ff1b-e147-4c07-9d9b-feabd33edead", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.896 186853 DEBUG nova.network.os_vif_util [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.896 186853 DEBUG os_vif [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.898 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f77ff1b-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.900 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.903 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.906 186853 INFO os_vif [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:b1:2a,bridge_name='br-int',has_traffic_filtering=True,id=4f77ff1b-e147-4c07-9d9b-feabd33edead,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f77ff1b-e1')#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.907 186853 INFO nova.virt.libvirt.driver [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Deleting instance files /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63_del#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.908 186853 INFO nova.virt.libvirt.driver [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Deletion of /var/lib/nova/instances/669c1c7b-c493-4f31-83dd-737239095b63_del complete#033[00m
Nov 22 02:54:47 np0005531887 podman[222203]: 2025-11-22 07:54:47.921027134 +0000 UTC m=+0.048613191 container remove 01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.926 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b40c814b-98f6-4e2f-b7f1-a8fc8a55af86]: (4, ('Sat Nov 22 07:54:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4)\n01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4\nSat Nov 22 07:54:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4)\n01f1df77941a7cc6f88d15b29f80e68be42ff670b99d953e4235d3ce2649a3b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.928 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d62a5ac6-6bde-4697-8fac-994bfb1978d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.929 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 02:54:47 np0005531887 nova_compute[186849]: 2025-11-22 07:54:47.944 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.946 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[634e4a80-4ed4-48f2-bdc1-efc9169bc461]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.962 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aa02a7a9-0170-48a4-b101-ada510b9e03d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.963 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e5568cda-6b56-40ad-a417-0fc902c8cdf4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.979 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7e2123b2-fb22-4fe8-a74c-dc70ec5084f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482401, 'reachable_time': 38685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222222, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:47 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.983 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:54:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:47.983 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[87bd5945-6345-4b6a-af6e-fe362b010835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.022 186853 INFO nova.compute.manager [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.023 186853 DEBUG oslo.service.loopingcall [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.023 186853 DEBUG nova.compute.manager [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.023 186853 DEBUG nova.network.neutron [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:54:48 np0005531887 podman[222215]: 2025-11-22 07:54:48.057248856 +0000 UTC m=+0.089237373 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.262 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.263 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.263 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.264 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.533 186853 DEBUG nova.compute.manager [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.534 186853 DEBUG oslo_concurrency.lockutils [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.534 186853 DEBUG oslo_concurrency.lockutils [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.535 186853 DEBUG oslo_concurrency.lockutils [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.535 186853 DEBUG nova.compute.manager [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.535 186853 DEBUG nova.compute.manager [req-392057c6-ec19-4818-a96b-ac150f11cd69 req-693a340c-5732-4c79-a5ce-d3c533ec8ed0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-unplugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:54:48 np0005531887 nova_compute[186849]: 2025-11-22 07:54:48.792 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 02:54:49 np0005531887 nova_compute[186849]: 2025-11-22 07:54:49.037 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:49 np0005531887 nova_compute[186849]: 2025-11-22 07:54:49.037 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:49 np0005531887 nova_compute[186849]: 2025-11-22 07:54:49.038 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:54:49 np0005531887 nova_compute[186849]: 2025-11-22 07:54:49.038 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 10a29489-706f-428f-b645-1c688d642f0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.008 186853 DEBUG nova.network.neutron [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.045 186853 INFO nova.compute.manager [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Took 2.02 seconds to deallocate network for instance.#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.219 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.220 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.349 186853 DEBUG nova.compute.provider_tree [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.374 186853 DEBUG nova.scheduler.client.report [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.410 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.487 186853 INFO nova.scheduler.client.report [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocations for instance 669c1c7b-c493-4f31-83dd-737239095b63#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.626 186853 DEBUG oslo_concurrency.lockutils [None req-c69aa66a-63e0-4ffe-88bd-02a41be4de79 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:50 np0005531887 podman[222237]: 2025-11-22 07:54:50.876471535 +0000 UTC m=+0.075937728 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.880 186853 DEBUG nova.compute.manager [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.880 186853 DEBUG oslo_concurrency.lockutils [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "669c1c7b-c493-4f31-83dd-737239095b63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.880 186853 DEBUG oslo_concurrency.lockutils [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.880 186853 DEBUG oslo_concurrency.lockutils [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "669c1c7b-c493-4f31-83dd-737239095b63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.881 186853 DEBUG nova.compute.manager [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] No waiting events found dispatching network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.881 186853 WARNING nova.compute.manager [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received unexpected event network-vif-plugged-4f77ff1b-e147-4c07-9d9b-feabd33edead for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:54:50 np0005531887 nova_compute[186849]: 2025-11-22 07:54:50.881 186853 DEBUG nova.compute.manager [req-01c5dfe9-2108-4736-a000-a936f51933e3 req-d415fdc5-7147-4228-baf5-7b2f3bff3074 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Received event network-vif-deleted-4f77ff1b-e147-4c07-9d9b-feabd33edead external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:51 np0005531887 nova_compute[186849]: 2025-11-22 07:54:51.485 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:52 np0005531887 nova_compute[186849]: 2025-11-22 07:54:52.012 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updating instance_info_cache with network_info: [{"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:54:52 np0005531887 nova_compute[186849]: 2025-11-22 07:54:52.049 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-10a29489-706f-428f-b645-1c688d642f0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:54:52 np0005531887 nova_compute[186849]: 2025-11-22 07:54:52.049 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:54:52 np0005531887 nova_compute[186849]: 2025-11-22 07:54:52.901 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:54 np0005531887 ovn_controller[95130]: 2025-11-22T07:54:54Z|00156|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.225 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.225 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.235 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.261 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.424 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.425 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.445 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.445 186853 INFO nova.compute.claims [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.683 186853 DEBUG nova.compute.provider_tree [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.706 186853 DEBUG nova.scheduler.client.report [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.742 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.743 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.824 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.824 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.852 186853 INFO nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:54:54 np0005531887 nova_compute[186849]: 2025-11-22 07:54:54.872 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.067 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.069 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.069 186853 INFO nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Creating image(s)#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.070 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.071 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.072 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.088 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.152 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.153 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.155 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.166 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.199 186853 DEBUG nova.policy [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.226 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.228 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.262 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.263 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.264 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.325 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.326 186853 DEBUG nova.virt.disk.api [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Checking if we can resize image /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.326 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.392 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.393 186853 DEBUG nova.virt.disk.api [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Cannot resize image /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.393 186853 DEBUG nova.objects.instance [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid a80b8598-7a3d-4859-8c57-d0c476d1fe01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.454 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.455 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Ensure instance console log exists: /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.455 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.456 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.456 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:54:55 np0005531887 podman[222274]: 2025-11-22 07:54:55.842565113 +0000 UTC m=+0.057209401 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:54:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:55.883 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:55.885 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.953 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798080.9477792, c64e78b6-87b2-425c-aef9-771bcd042d58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:54:55 np0005531887 nova_compute[186849]: 2025-11-22 07:54:55.954 186853 INFO nova.compute.manager [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:54:56 np0005531887 nova_compute[186849]: 2025-11-22 07:54:56.179 186853 DEBUG nova.compute.manager [None req-dfd7143c-7afd-4eca-87fd-7e9cc90992e0 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:54:56 np0005531887 nova_compute[186849]: 2025-11-22 07:54:56.486 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:57 np0005531887 nova_compute[186849]: 2025-11-22 07:54:57.129 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Successfully created port: 3d99992d-6b96-45f9-9dff-ff737054f3f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:54:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:54:57.887 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:54:57 np0005531887 nova_compute[186849]: 2025-11-22 07:54:57.905 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.587 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Successfully updated port: 3d99992d-6b96-45f9-9dff-ff737054f3f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.608 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.608 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.608 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.800 186853 DEBUG nova.compute.manager [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-changed-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.801 186853 DEBUG nova.compute.manager [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Refreshing instance network info cache due to event network-changed-3d99992d-6b96-45f9-9dff-ff737054f3f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:54:59 np0005531887 nova_compute[186849]: 2025-11-22 07:54:59.801 186853 DEBUG oslo_concurrency.lockutils [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:00 np0005531887 nova_compute[186849]: 2025-11-22 07:55:00.073 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:55:01 np0005531887 nova_compute[186849]: 2025-11-22 07:55:01.489 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:01 np0005531887 podman[222298]: 2025-11-22 07:55:01.855634467 +0000 UTC m=+0.066293242 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64)
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.544 186853 DEBUG nova.network.neutron [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Updating instance_info_cache with network_info: [{"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.592 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.593 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Instance network_info: |[{"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.595 186853 DEBUG oslo_concurrency.lockutils [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.596 186853 DEBUG nova.network.neutron [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Refreshing network info cache for port 3d99992d-6b96-45f9-9dff-ff737054f3f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.599 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Start _get_guest_xml network_info=[{"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.605 186853 WARNING nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.611 186853 DEBUG nova.virt.libvirt.host [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.612 186853 DEBUG nova.virt.libvirt.host [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.615 186853 DEBUG nova.virt.libvirt.host [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.616 186853 DEBUG nova.virt.libvirt.host [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.617 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.617 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.618 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.618 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.618 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.619 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.619 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.619 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.619 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.620 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.620 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.620 186853 DEBUG nova.virt.hardware [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.624 186853 DEBUG nova.virt.libvirt.vif [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1757265874',display_name='tempest-ImagesTestJSON-server-1757265874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1757265874',id=66,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mp0eyiac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:54Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a80b8598-7a3d-4859-8c57-d0c476d1fe01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.625 186853 DEBUG nova.network.os_vif_util [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.625 186853 DEBUG nova.network.os_vif_util [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.626 186853 DEBUG nova.objects.instance [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid a80b8598-7a3d-4859-8c57-d0c476d1fe01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.647 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <uuid>a80b8598-7a3d-4859-8c57-d0c476d1fe01</uuid>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <name>instance-00000042</name>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:name>tempest-ImagesTestJSON-server-1757265874</nova:name>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:55:02</nova:creationTime>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        <nova:port uuid="3d99992d-6b96-45f9-9dff-ff737054f3f5">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="serial">a80b8598-7a3d-4859-8c57-d0c476d1fe01</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="uuid">a80b8598-7a3d-4859-8c57-d0c476d1fe01</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.config"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ba:c3:74"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <target dev="tap3d99992d-6b"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/console.log" append="off"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:55:02 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:55:02 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:55:02 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:55:02 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.649 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Preparing to wait for external event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.649 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.649 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.650 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.650 186853 DEBUG nova.virt.libvirt.vif [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1757265874',display_name='tempest-ImagesTestJSON-server-1757265874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1757265874',id=66,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mp0eyiac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:54Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a80b8598-7a3d-4859-8c57-d0c476d1fe01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.651 186853 DEBUG nova.network.os_vif_util [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.651 186853 DEBUG nova.network.os_vif_util [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.652 186853 DEBUG os_vif [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.653 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.654 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.654 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.658 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.659 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d99992d-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.659 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3d99992d-6b, col_values=(('external_ids', {'iface-id': '3d99992d-6b96-45f9-9dff-ff737054f3f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:c3:74', 'vm-uuid': 'a80b8598-7a3d-4859-8c57-d0c476d1fe01'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.661 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:02 np0005531887 NetworkManager[55210]: <info>  [1763798102.6629] manager: (tap3d99992d-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.663 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.669 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.670 186853 INFO os_vif [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b')#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.746 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.747 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.747 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:ba:c3:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.748 186853 INFO nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Using config drive#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.866 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798087.8660126, 669c1c7b-c493-4f31-83dd-737239095b63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.866 186853 INFO nova.compute.manager [-] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:55:02 np0005531887 nova_compute[186849]: 2025-11-22 07:55:02.909 186853 DEBUG nova.compute.manager [None req-7a7cb065-5a19-4560-ae0d-301a423b5be6 - - - - - -] [instance: 669c1c7b-c493-4f31-83dd-737239095b63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.120 186853 INFO nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Creating config drive at /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.config#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.126 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6c8su5f8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.256 186853 DEBUG oslo_concurrency.processutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6c8su5f8" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:04 np0005531887 kernel: tap3d99992d-6b: entered promiscuous mode
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.3194] manager: (tap3d99992d-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 22 02:55:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:04Z|00157|binding|INFO|Claiming lport 3d99992d-6b96-45f9-9dff-ff737054f3f5 for this chassis.
Nov 22 02:55:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:04Z|00158|binding|INFO|3d99992d-6b96-45f9-9dff-ff737054f3f5: Claiming fa:16:3e:ba:c3:74 10.100.0.7
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.323 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.339 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:c3:74 10.100.0.7'], port_security=['fa:16:3e:ba:c3:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a80b8598-7a3d-4859-8c57-d0c476d1fe01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3d99992d-6b96-45f9-9dff-ff737054f3f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.341 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3d99992d-6b96-45f9-9dff-ff737054f3f5 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.342 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a#033[00m
Nov 22 02:55:04 np0005531887 systemd-udevd[222340]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.356 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4f03be05-c057-4672-b9ab-694b8e6cf3cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.358 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.360 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.360 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e42eeb2b-67cd-4794-8051-0a6c72a4c386]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 systemd-machined[153180]: New machine qemu-25-instance-00000042.
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.362 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8bcf9e-85de-4bad-92fa-586022aff714]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.3735] device (tap3d99992d-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.3744] device (tap3d99992d-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.375 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ff59630c-4cc0-47a9-88df-4ef0a2bb2474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.378 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:04Z|00159|binding|INFO|Setting lport 3d99992d-6b96-45f9-9dff-ff737054f3f5 ovn-installed in OVS
Nov 22 02:55:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:04Z|00160|binding|INFO|Setting lport 3d99992d-6b96-45f9-9dff-ff737054f3f5 up in Southbound
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 systemd[1]: Started Virtual Machine qemu-25-instance-00000042.
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.393 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb3a326-a29c-42b4-9c11-bd2934d846b3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.433 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a579eefd-4fe6-43bc-b8eb-ea4495c4e68f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.4426] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.442 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a152fa69-4b4f-4b72-a407-ad8cf4862a13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.487 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a40c35-9b64-4f8a-9970-9c7b3959a06d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.491 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5e69a453-5120-40db-a629-a5afb32ef70c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.5215] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.528 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a370f9c1-963c-48e2-a85f-f956b7b470ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.552 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[21367f0c-9c5e-4e51-bdf3-6d2e7c86f4bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484712, 'reachable_time': 34656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222373, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.573 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8393e59f-b9af-4187-aaaa-771ed7a3efab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 484712, 'tstamp': 484712}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222374, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.593 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5be45b-882d-4644-81ba-cdfdd9dd8267]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484712, 'reachable_time': 34656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222375, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.630 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[df6a43ab-c15f-480a-916a-8e61c31b9729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.705 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798104.7048485, a80b8598-7a3d-4859-8c57-d0c476d1fe01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.706 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] VM Started (Lifecycle Event)#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.706 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a66e6374-4bcb-4a5c-890b-1ba613384cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.708 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.708 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.708 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:04 np0005531887 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 02:55:04 np0005531887 NetworkManager[55210]: <info>  [1763798104.7120] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.710 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.714 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.715 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:04Z|00161|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.717 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.717 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.719 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[22ded51a-7879-40e5-8336-7dcbe3506e79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.720 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:55:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:04.721 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.730 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.759 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.768 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798104.7086194, a80b8598-7a3d-4859-8c57-d0c476d1fe01 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.769 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.794 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.798 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:04 np0005531887 nova_compute[186849]: 2025-11-22 07:55:04.827 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.074 186853 DEBUG nova.compute.manager [req-1384f58e-e15c-4c69-9e7b-82541fe5360a req-240cf19b-0cc5-43a2-a1a5-61a9deddd0bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.076 186853 DEBUG oslo_concurrency.lockutils [req-1384f58e-e15c-4c69-9e7b-82541fe5360a req-240cf19b-0cc5-43a2-a1a5-61a9deddd0bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.076 186853 DEBUG oslo_concurrency.lockutils [req-1384f58e-e15c-4c69-9e7b-82541fe5360a req-240cf19b-0cc5-43a2-a1a5-61a9deddd0bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.077 186853 DEBUG oslo_concurrency.lockutils [req-1384f58e-e15c-4c69-9e7b-82541fe5360a req-240cf19b-0cc5-43a2-a1a5-61a9deddd0bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.077 186853 DEBUG nova.compute.manager [req-1384f58e-e15c-4c69-9e7b-82541fe5360a req-240cf19b-0cc5-43a2-a1a5-61a9deddd0bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Processing event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.078 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.083 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798105.0828922, a80b8598-7a3d-4859-8c57-d0c476d1fe01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.083 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.085 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.090 186853 INFO nova.virt.libvirt.driver [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Instance spawned successfully.#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.090 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.125 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.132 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.135 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.135 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.135 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.136 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.136 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.136 186853 DEBUG nova.virt.libvirt.driver [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:05 np0005531887 podman[222414]: 2025-11-22 07:55:05.151885729 +0000 UTC m=+0.061606948 container create 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.172 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:05 np0005531887 systemd[1]: Started libpod-conmon-5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a.scope.
Nov 22 02:55:05 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:55:05 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a1ee5e291e7e28caaa1e2fd153be38ca30298fb88dfc57f3d670cdd0b5d5558/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:55:05 np0005531887 podman[222414]: 2025-11-22 07:55:05.113343157 +0000 UTC m=+0.023064396 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.221 186853 INFO nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Took 10.15 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.221 186853 DEBUG nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:05 np0005531887 podman[222414]: 2025-11-22 07:55:05.22548188 +0000 UTC m=+0.135203119 container init 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 02:55:05 np0005531887 podman[222414]: 2025-11-22 07:55:05.235804173 +0000 UTC m=+0.145525392 container start 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 02:55:05 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [NOTICE]   (222433) : New worker (222435) forked
Nov 22 02:55:05 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [NOTICE]   (222433) : Loading success.
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.336 186853 INFO nova.compute.manager [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Took 10.98 seconds to build instance.#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.399 186853 DEBUG oslo_concurrency.lockutils [None req-cda58355-66b6-4aed-b90e-5bf691e51026 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.704 186853 DEBUG nova.network.neutron [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Updated VIF entry in instance network info cache for port 3d99992d-6b96-45f9-9dff-ff737054f3f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.705 186853 DEBUG nova.network.neutron [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Updating instance_info_cache with network_info: [{"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:05 np0005531887 nova_compute[186849]: 2025-11-22 07:55:05.798 186853 DEBUG oslo_concurrency.lockutils [req-d2185022-7dc3-46c8-8d34-1a2a1fc6ebc1 req-17d99590-0ad4-4f7b-85ea-a183bb8fb87c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a80b8598-7a3d-4859-8c57-d0c476d1fe01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:05 np0005531887 podman[222444]: 2025-11-22 07:55:05.851419396 +0000 UTC m=+0.071606333 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:55:05 np0005531887 podman[222445]: 2025-11-22 07:55:05.877868853 +0000 UTC m=+0.094545585 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 02:55:06 np0005531887 nova_compute[186849]: 2025-11-22 07:55:06.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:07 np0005531887 nova_compute[186849]: 2025-11-22 07:55:07.662 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.660 186853 DEBUG nova.compute.manager [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.660 186853 DEBUG oslo_concurrency.lockutils [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.661 186853 DEBUG oslo_concurrency.lockutils [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.661 186853 DEBUG oslo_concurrency.lockutils [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.661 186853 DEBUG nova.compute.manager [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] No waiting events found dispatching network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:08 np0005531887 nova_compute[186849]: 2025-11-22 07:55:08.661 186853 WARNING nova.compute.manager [req-634e4356-4464-4a07-9562-6fbfe174e1d6 req-ca60d163-7890-441f-acdf-66e63f2f3a1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received unexpected event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.416 186853 DEBUG nova.objects.instance [None req-f24c095d-b662-4bcf-bffb-8c5d29648cef 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid a80b8598-7a3d-4859-8c57-d0c476d1fe01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.443 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798109.4432738, a80b8598-7a3d-4859-8c57-d0c476d1fe01 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.443 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.461 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.466 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:09 np0005531887 nova_compute[186849]: 2025-11-22 07:55:09.486 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 22 02:55:10 np0005531887 kernel: tap3d99992d-6b (unregistering): left promiscuous mode
Nov 22 02:55:10 np0005531887 NetworkManager[55210]: <info>  [1763798110.0289] device (tap3d99992d-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:55:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:10Z|00162|binding|INFO|Releasing lport 3d99992d-6b96-45f9-9dff-ff737054f3f5 from this chassis (sb_readonly=0)
Nov 22 02:55:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:10Z|00163|binding|INFO|Setting lport 3d99992d-6b96-45f9-9dff-ff737054f3f5 down in Southbound
Nov 22 02:55:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:10Z|00164|binding|INFO|Removing iface tap3d99992d-6b ovn-installed in OVS
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.037 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.051 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:c3:74 10.100.0.7'], port_security=['fa:16:3e:ba:c3:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a80b8598-7a3d-4859-8c57-d0c476d1fe01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3d99992d-6b96-45f9-9dff-ff737054f3f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.053 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3d99992d-6b96-45f9-9dff-ff737054f3f5 in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.054 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.055 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[29222486-7b15-4415-b68d-dea3303050e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.055 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore#033[00m
Nov 22 02:55:10 np0005531887 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 22 02:55:10 np0005531887 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000042.scope: Consumed 4.803s CPU time.
Nov 22 02:55:10 np0005531887 systemd-machined[153180]: Machine qemu-25-instance-00000042 terminated.
Nov 22 02:55:10 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [NOTICE]   (222433) : haproxy version is 2.8.14-c23fe91
Nov 22 02:55:10 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [NOTICE]   (222433) : path to executable is /usr/sbin/haproxy
Nov 22 02:55:10 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [WARNING]  (222433) : Exiting Master process...
Nov 22 02:55:10 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [ALERT]    (222433) : Current worker (222435) exited with code 143 (Terminated)
Nov 22 02:55:10 np0005531887 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[222427]: [WARNING]  (222433) : All workers exited. Exiting... (0)
Nov 22 02:55:10 np0005531887 systemd[1]: libpod-5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a.scope: Deactivated successfully.
Nov 22 02:55:10 np0005531887 podman[222515]: 2025-11-22 07:55:10.1977801 +0000 UTC m=+0.052848053 container died 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:55:10 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a-userdata-shm.mount: Deactivated successfully.
Nov 22 02:55:10 np0005531887 systemd[1]: var-lib-containers-storage-overlay-7a1ee5e291e7e28caaa1e2fd153be38ca30298fb88dfc57f3d670cdd0b5d5558-merged.mount: Deactivated successfully.
Nov 22 02:55:10 np0005531887 podman[222515]: 2025-11-22 07:55:10.245553289 +0000 UTC m=+0.100621242 container cleanup 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:55:10 np0005531887 systemd[1]: libpod-conmon-5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a.scope: Deactivated successfully.
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.257 186853 DEBUG nova.compute.manager [None req-f24c095d-b662-4bcf-bffb-8c5d29648cef 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:10 np0005531887 podman[222560]: 2025-11-22 07:55:10.307743661 +0000 UTC m=+0.040041151 container remove 5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.313 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[545cb204-3f05-4bff-9f59-05a178dc6d39]: (4, ('Sat Nov 22 07:55:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a)\n5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a\nSat Nov 22 07:55:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a)\n5f92f9c9e25d2319fe018dd9a4d9ac72c98b4d0db0a3749dae401b6fab9d834a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.314 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[baf240aa-9044-4ee0-ada0-6c92d8069321]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.315 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.317 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531887 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.333 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.338 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[78697c68-9f50-4893-b668-0fe6566324ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.358 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[57b917a6-0ee0-4e25-8255-50f49b55f64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.360 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f8009b-3349-42f0-8d48-abfc7b35ab65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.375 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9c18b245-9e9d-4ec1-ba9f-3b21fd7da235]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484703, 'reachable_time': 19631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222578, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.378 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:55:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:10.378 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ef34297b-8341-4ce2-8f5e-2cf0adbef089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:10 np0005531887 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.759 186853 DEBUG nova.compute.manager [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-vif-unplugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.760 186853 DEBUG oslo_concurrency.lockutils [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.760 186853 DEBUG oslo_concurrency.lockutils [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.760 186853 DEBUG oslo_concurrency.lockutils [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.760 186853 DEBUG nova.compute.manager [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] No waiting events found dispatching network-vif-unplugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:10 np0005531887 nova_compute[186849]: 2025-11-22 07:55:10.761 186853 WARNING nova.compute.manager [req-17e4dae8-0b14-4bc4-8e9e-26d2c08a83a1 req-ed88c7fe-8b8c-47b7-b034-5fc2a8a03386 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received unexpected event network-vif-unplugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 for instance with vm_state suspended and task_state None.#033[00m
Nov 22 02:55:11 np0005531887 nova_compute[186849]: 2025-11-22 07:55:11.492 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:12 np0005531887 nova_compute[186849]: 2025-11-22 07:55:12.667 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:12 np0005531887 podman[222579]: 2025-11-22 07:55:12.841069246 +0000 UTC m=+0.057208281 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.058 186853 DEBUG nova.compute.manager [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.058 186853 DEBUG oslo_concurrency.lockutils [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.059 186853 DEBUG oslo_concurrency.lockutils [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.059 186853 DEBUG oslo_concurrency.lockutils [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.059 186853 DEBUG nova.compute.manager [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] No waiting events found dispatching network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.059 186853 WARNING nova.compute.manager [req-89e1720f-7f59-46db-a434-918bfddbb0d4 req-f651784a-059a-47a5-b37a-1b56904b84ee 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received unexpected event network-vif-plugged-3d99992d-6b96-45f9-9dff-ff737054f3f5 for instance with vm_state suspended and task_state None.#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.177 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.177 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.178 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.178 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.178 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.187 186853 INFO nova.compute.manager [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Terminating instance#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.198 186853 DEBUG nova.compute.manager [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:55:13 np0005531887 kernel: tapc27f5a73-ae (unregistering): left promiscuous mode
Nov 22 02:55:13 np0005531887 NetworkManager[55210]: <info>  [1763798113.2294] device (tapc27f5a73-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:55:13 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:13Z|00165|binding|INFO|Releasing lport c27f5a73-ae9a-4f31-95ec-4d5fa852d61b from this chassis (sb_readonly=0)
Nov 22 02:55:13 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:13Z|00166|binding|INFO|Setting lport c27f5a73-ae9a-4f31-95ec-4d5fa852d61b down in Southbound
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.240 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:13Z|00167|binding|INFO|Removing iface tapc27f5a73-ae ovn-installed in OVS
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.242 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.262 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.284 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:d5:34 10.100.0.14'], port_security=['fa:16:3e:15:d5:34 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '10a29489-706f-428f-b645-1c688d642f0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.285 104084 INFO neutron.agent.ovn.metadata.agent [-] Port c27f5a73-ae9a-4f31-95ec-4d5fa852d61b in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed unbound from our chassis#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.287 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62930ff4-55a3-4e08-8229-5532aa7dcaed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.288 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[38c8779e-ae9d-459e-9103-7af68e3fb570]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.288 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace which is not needed anymore#033[00m
Nov 22 02:55:13 np0005531887 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000040.scope: Deactivated successfully.
Nov 22 02:55:13 np0005531887 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000040.scope: Consumed 15.448s CPU time.
Nov 22 02:55:13 np0005531887 systemd-machined[153180]: Machine qemu-22-instance-00000040 terminated.
Nov 22 02:55:13 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [NOTICE]   (221555) : haproxy version is 2.8.14-c23fe91
Nov 22 02:55:13 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [NOTICE]   (221555) : path to executable is /usr/sbin/haproxy
Nov 22 02:55:13 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [WARNING]  (221555) : Exiting Master process...
Nov 22 02:55:13 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [ALERT]    (221555) : Current worker (221557) exited with code 143 (Terminated)
Nov 22 02:55:13 np0005531887 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[221541]: [WARNING]  (221555) : All workers exited. Exiting... (0)
Nov 22 02:55:13 np0005531887 systemd[1]: libpod-a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab.scope: Deactivated successfully.
Nov 22 02:55:13 np0005531887 podman[222629]: 2025-11-22 07:55:13.437136431 +0000 UTC m=+0.055703414 container died a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:55:13 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab-userdata-shm.mount: Deactivated successfully.
Nov 22 02:55:13 np0005531887 systemd[1]: var-lib-containers-storage-overlay-59c2ae45fd39adfa52908665011272049330b1170d173531091d346bd390324d-merged.mount: Deactivated successfully.
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.472 186853 INFO nova.virt.libvirt.driver [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Instance destroyed successfully.#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.473 186853 DEBUG nova.objects.instance [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'resources' on Instance uuid 10a29489-706f-428f-b645-1c688d642f0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:13 np0005531887 podman[222629]: 2025-11-22 07:55:13.480080762 +0000 UTC m=+0.098647755 container cleanup a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:55:13 np0005531887 systemd[1]: libpod-conmon-a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab.scope: Deactivated successfully.
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.493 186853 DEBUG nova.virt.libvirt.vif [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-688404127',display_name='tempest-ListServerFiltersTestJSON-instance-688404127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-688404127',id=64,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-d6cq3b30',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:31Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=10a29489-706f-428f-b645-1c688d642f0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.494 186853 DEBUG nova.network.os_vif_util [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "address": "fa:16:3e:15:d5:34", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc27f5a73-ae", "ovs_interfaceid": "c27f5a73-ae9a-4f31-95ec-4d5fa852d61b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.495 186853 DEBUG nova.network.os_vif_util [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.495 186853 DEBUG os_vif [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.497 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f5a73-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.506 186853 INFO os_vif [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:d5:34,bridge_name='br-int',has_traffic_filtering=True,id=c27f5a73-ae9a-4f31-95ec-4d5fa852d61b,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc27f5a73-ae')#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.507 186853 INFO nova.virt.libvirt.driver [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Deleting instance files /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b_del#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.507 186853 INFO nova.virt.libvirt.driver [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Deletion of /var/lib/nova/instances/10a29489-706f-428f-b645-1c688d642f0b_del complete#033[00m
Nov 22 02:55:13 np0005531887 podman[222678]: 2025-11-22 07:55:13.548903206 +0000 UTC m=+0.044439439 container remove a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.555 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[76eaf220-74c8-4ae7-a60b-838ceda238e5]: (4, ('Sat Nov 22 07:55:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab)\na8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab\nSat Nov 22 07:55:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (a8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab)\na8561b611dac9379a07a3ef23221fb187772a6529ad6dfa0f6c85ccffaaf64ab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.557 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2d7b09-17de-49c5-aa3a-63bb075616a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.558 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 kernel: tap62930ff4-50: left promiscuous mode
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.576 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.579 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3b285a60-f529-4a63-99d7-7ba26d72671f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.603 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7aacf9f1-f55e-4cf2-b3b6-eddd2966a154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.604 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[40671221-a465-44e7-803c-bd86a4b97cd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.611 186853 INFO nova.compute.manager [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.612 186853 DEBUG oslo.service.loopingcall [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.613 186853 DEBUG nova.compute.manager [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.613 186853 DEBUG nova.network.neutron [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.623 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[06d7e390-da9d-4ab7-981a-5e0522da438d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481277, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222698, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.626 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:55:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:13.626 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[52a6f84a-e01d-4373-8b0a-025c94994a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:13 np0005531887 systemd[1]: run-netns-ovnmeta\x2d62930ff4\x2d55a3\x2d4e08\x2d8229\x2d5532aa7dcaed.mount: Deactivated successfully.
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.796 186853 DEBUG nova.compute.manager [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.909 186853 INFO nova.compute.manager [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] instance snapshotting#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.909 186853 WARNING nova.compute.manager [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.960 186853 DEBUG nova.compute.manager [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-unplugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.960 186853 DEBUG oslo_concurrency.lockutils [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.960 186853 DEBUG oslo_concurrency.lockutils [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.961 186853 DEBUG oslo_concurrency.lockutils [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.961 186853 DEBUG nova.compute.manager [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] No waiting events found dispatching network-vif-unplugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:13 np0005531887 nova_compute[186849]: 2025-11-22 07:55:13.961 186853 DEBUG nova.compute.manager [req-6b7aafb7-7f63-4e2f-a384-9044de263200 req-b1e3cef4-a96d-4185-8d4a-5637fb10c53e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-unplugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:55:14 np0005531887 nova_compute[186849]: 2025-11-22 07:55:14.385 186853 INFO nova.virt.libvirt.driver [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Beginning cold snapshot process#033[00m
Nov 22 02:55:14 np0005531887 nova_compute[186849]: 2025-11-22 07:55:14.625 186853 DEBUG nova.privsep.utils [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:55:14 np0005531887 nova_compute[186849]: 2025-11-22 07:55:14.626 186853 DEBUG oslo_concurrency.processutils [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk /var/lib/nova/instances/snapshots/tmp5v69dmju/d73f9ba3156a47c6bcfecc4984fcd865 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:14 np0005531887 nova_compute[186849]: 2025-11-22 07:55:14.839 186853 DEBUG oslo_concurrency.processutils [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01/disk /var/lib/nova/instances/snapshots/tmp5v69dmju/d73f9ba3156a47c6bcfecc4984fcd865" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:14 np0005531887 nova_compute[186849]: 2025-11-22 07:55:14.840 186853 INFO nova.virt.libvirt.driver [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.432 186853 DEBUG nova.network.neutron [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.511 186853 INFO nova.compute.manager [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Took 1.90 seconds to deallocate network for instance.#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.590 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.591 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.620 186853 DEBUG nova.compute.manager [req-558a5adc-7b3b-42b6-9131-8620dee5f47f req-d8752b2d-6494-41d3-a5d7-cfb070f19d91 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-deleted-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.629 186853 DEBUG nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.673 186853 DEBUG nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.673 186853 DEBUG nova.compute.provider_tree [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.691 186853 DEBUG nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.714 186853 DEBUG nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.795 186853 DEBUG nova.compute.provider_tree [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.808 186853 DEBUG nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.847 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:15 np0005531887 nova_compute[186849]: 2025-11-22 07:55:15.916 186853 INFO nova.scheduler.client.report [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Deleted allocations for instance 10a29489-706f-428f-b645-1c688d642f0b#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.014 186853 DEBUG oslo_concurrency.lockutils [None req-7a047783-f98f-47b0-b150-3b0e31b7503e 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.074 186853 DEBUG nova.compute.manager [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.076 186853 DEBUG oslo_concurrency.lockutils [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "10a29489-706f-428f-b645-1c688d642f0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.076 186853 DEBUG oslo_concurrency.lockutils [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.077 186853 DEBUG oslo_concurrency.lockutils [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "10a29489-706f-428f-b645-1c688d642f0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.077 186853 DEBUG nova.compute.manager [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] No waiting events found dispatching network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.077 186853 WARNING nova.compute.manager [req-662a6332-3c02-4e15-9e16-d0d99de4b88a req-d7ebfa54-8958-4cf8-9894-370ab56b4a4b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Received unexpected event network-vif-plugged-c27f5a73-ae9a-4f31-95ec-4d5fa852d61b for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:55:16 np0005531887 nova_compute[186849]: 2025-11-22 07:55:16.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:17 np0005531887 nova_compute[186849]: 2025-11-22 07:55:17.990 186853 INFO nova.virt.libvirt.driver [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Snapshot image upload complete#033[00m
Nov 22 02:55:17 np0005531887 nova_compute[186849]: 2025-11-22 07:55:17.992 186853 INFO nova.compute.manager [None req-4d3d9a13-5bba-4128-94d3-008cc162f1af 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Took 4.08 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:55:18 np0005531887 nova_compute[186849]: 2025-11-22 07:55:18.501 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:18 np0005531887 podman[222705]: 2025-11-22 07:55:18.84995794 +0000 UTC m=+0.063781552 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.001 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.002 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.002 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.002 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.002 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.010 186853 INFO nova.compute.manager [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Terminating instance#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.020 186853 DEBUG nova.compute.manager [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.026 186853 INFO nova.virt.libvirt.driver [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Instance destroyed successfully.#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.026 186853 DEBUG nova.objects.instance [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid a80b8598-7a3d-4859-8c57-d0c476d1fe01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.037 186853 DEBUG nova.virt.libvirt.vif [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1757265874',display_name='tempest-ImagesTestJSON-server-1757265874',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-1757265874',id=66,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-mp0eyiac',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:18Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=a80b8598-7a3d-4859-8c57-d0c476d1fe01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.037 186853 DEBUG nova.network.os_vif_util [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "address": "fa:16:3e:ba:c3:74", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d99992d-6b", "ovs_interfaceid": "3d99992d-6b96-45f9-9dff-ff737054f3f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.038 186853 DEBUG nova.network.os_vif_util [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.038 186853 DEBUG os_vif [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.040 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d99992d-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.042 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.044 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.046 186853 INFO os_vif [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:c3:74,bridge_name='br-int',has_traffic_filtering=True,id=3d99992d-6b96-45f9-9dff-ff737054f3f5,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d99992d-6b')#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.047 186853 INFO nova.virt.libvirt.driver [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Deleting instance files /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01_del#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.047 186853 INFO nova.virt.libvirt.driver [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Deletion of /var/lib/nova/instances/a80b8598-7a3d-4859-8c57-d0c476d1fe01_del complete#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.140 186853 INFO nova.compute.manager [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Took 0.12 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.141 186853 DEBUG oslo.service.loopingcall [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.141 186853 DEBUG nova.compute.manager [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:55:20 np0005531887 nova_compute[186849]: 2025-11-22 07:55:20.141 186853 DEBUG nova.network.neutron [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.495 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.642 186853 DEBUG nova.network.neutron [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.686 186853 INFO nova.compute.manager [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Took 1.54 seconds to deallocate network for instance.#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.721 186853 DEBUG nova.compute.manager [req-e3999397-77b4-46c5-80ef-48c8a4883c6d req-b3fba8f6-8712-412d-9bfc-fd94dfb6a8da 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Received event network-vif-deleted-3d99992d-6b96-45f9-9dff-ff737054f3f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.782 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.782 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.859 186853 DEBUG nova.compute.provider_tree [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:21 np0005531887 podman[222723]: 2025-11-22 07:55:21.874075123 +0000 UTC m=+0.077229530 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.882 186853 DEBUG nova.scheduler.client.report [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.915 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:21 np0005531887 nova_compute[186849]: 2025-11-22 07:55:21.949 186853 INFO nova.scheduler.client.report [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance a80b8598-7a3d-4859-8c57-d0c476d1fe01#033[00m
Nov 22 02:55:22 np0005531887 nova_compute[186849]: 2025-11-22 07:55:22.053 186853 DEBUG oslo_concurrency.lockutils [None req-9b0f8dba-84fa-4495-906b-6628a3650243 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "a80b8598-7a3d-4859-8c57-d0c476d1fe01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:24 np0005531887 nova_compute[186849]: 2025-11-22 07:55:24.609 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:25 np0005531887 nova_compute[186849]: 2025-11-22 07:55:25.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:25 np0005531887 nova_compute[186849]: 2025-11-22 07:55:25.260 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798110.257475, a80b8598-7a3d-4859-8c57-d0c476d1fe01 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:25 np0005531887 nova_compute[186849]: 2025-11-22 07:55:25.260 186853 INFO nova.compute.manager [-] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:55:25 np0005531887 nova_compute[186849]: 2025-11-22 07:55:25.282 186853 DEBUG nova.compute.manager [None req-c8645366-fec6-4374-ac7b-4baab72356eb - - - - - -] [instance: a80b8598-7a3d-4859-8c57-d0c476d1fe01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:26 np0005531887 nova_compute[186849]: 2025-11-22 07:55:26.498 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:26 np0005531887 podman[222742]: 2025-11-22 07:55:26.85877421 +0000 UTC m=+0.078525798 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:55:28 np0005531887 nova_compute[186849]: 2025-11-22 07:55:28.469 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798113.467875, 10a29489-706f-428f-b645-1c688d642f0b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:28 np0005531887 nova_compute[186849]: 2025-11-22 07:55:28.470 186853 INFO nova.compute.manager [-] [instance: 10a29489-706f-428f-b645-1c688d642f0b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:55:28 np0005531887 nova_compute[186849]: 2025-11-22 07:55:28.508 186853 DEBUG nova.compute.manager [None req-ed10f0fc-0fe5-4c18-bf66-d75219d05838 - - - - - -] [instance: 10a29489-706f-428f-b645-1c688d642f0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:30 np0005531887 nova_compute[186849]: 2025-11-22 07:55:30.046 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:31 np0005531887 nova_compute[186849]: 2025-11-22 07:55:31.498 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.816 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.817 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.845 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:55:32 np0005531887 podman[222768]: 2025-11-22 07:55:32.854518725 +0000 UTC m=+0.069179104 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.963 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.964 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.970 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:55:32 np0005531887 nova_compute[186849]: 2025-11-22 07:55:32.970 186853 INFO nova.compute.claims [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.198 186853 DEBUG nova.compute.provider_tree [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.220 186853 DEBUG nova.scheduler.client.report [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.257 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.258 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.405 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.405 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.431 186853 INFO nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.448 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.604 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.606 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.606 186853 INFO nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Creating image(s)#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.607 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.607 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.608 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.620 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.674 186853 DEBUG nova.policy [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '14fea7f1307a4a04bd44f1831c499515', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '04bb699d9f7643838b7e68b6892b2373', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.693 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.694 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.695 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.708 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.782 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.784 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.828 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.829 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.830 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.892 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.893 186853 DEBUG nova.virt.disk.api [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Checking if we can resize image /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.893 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.957 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.958 186853 DEBUG nova.virt.disk.api [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Cannot resize image /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.958 186853 DEBUG nova.objects.instance [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lazy-loading 'migration_context' on Instance uuid c4014a21-495c-43f6-b9b0-e6460ba53d12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.984 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.984 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Ensure instance console log exists: /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.985 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.985 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:33 np0005531887 nova_compute[186849]: 2025-11-22 07:55:33.985 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:35 np0005531887 nova_compute[186849]: 2025-11-22 07:55:35.048 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:35 np0005531887 nova_compute[186849]: 2025-11-22 07:55:35.149 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Successfully created port: 376ec00d-cb9d-470f-abd0-a10f2086e245 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:55:36 np0005531887 nova_compute[186849]: 2025-11-22 07:55:36.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:36 np0005531887 podman[222802]: 2025-11-22 07:55:36.850250757 +0000 UTC m=+0.068764804 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 02:55:36 np0005531887 podman[222803]: 2025-11-22 07:55:36.882855179 +0000 UTC m=+0.097562891 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 02:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:37.325 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:37.326 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:37.326 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.523 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Successfully updated port: 376ec00d-cb9d-470f-abd0-a10f2086e245 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.555 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.556 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquired lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.556 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.683 186853 DEBUG nova.compute.manager [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-changed-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.684 186853 DEBUG nova.compute.manager [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Refreshing instance network info cache due to event network-changed-376ec00d-cb9d-470f-abd0-a10f2086e245. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.684 186853 DEBUG oslo_concurrency.lockutils [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:38 np0005531887 nova_compute[186849]: 2025-11-22 07:55:38.816 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.593 186853 DEBUG nova.network.neutron [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updating instance_info_cache with network_info: [{"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.616 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Releasing lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.617 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance network_info: |[{"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.617 186853 DEBUG oslo_concurrency.lockutils [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.618 186853 DEBUG nova.network.neutron [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Refreshing network info cache for port 376ec00d-cb9d-470f-abd0-a10f2086e245 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.621 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Start _get_guest_xml network_info=[{"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.629 186853 WARNING nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.638 186853 DEBUG nova.virt.libvirt.host [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.639 186853 DEBUG nova.virt.libvirt.host [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.645 186853 DEBUG nova.virt.libvirt.host [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.646 186853 DEBUG nova.virt.libvirt.host [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.647 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.648 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.648 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.648 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.649 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.649 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.649 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.649 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.650 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.650 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.650 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.651 186853 DEBUG nova.virt.hardware [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.655 186853 DEBUG nova.virt.libvirt.vif [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1798420740',display_name='tempest-ServerMetadataTestJSON-server-1798420740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1798420740',id=69,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='04bb699d9f7643838b7e68b6892b2373',ramdisk_id='',reservation_id='r-fp8d0mnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1046671153',owner_user_name='tempest-ServerMetadataTestJSON-1046671153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:33Z,user_data=None,user_id='14fea7f1307a4a04bd44f1831c499515',uuid=c4014a21-495c-43f6-b9b0-e6460ba53d12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.656 186853 DEBUG nova.network.os_vif_util [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converting VIF {"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.656 186853 DEBUG nova.network.os_vif_util [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.657 186853 DEBUG nova.objects.instance [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4014a21-495c-43f6-b9b0-e6460ba53d12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.679 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <uuid>c4014a21-495c-43f6-b9b0-e6460ba53d12</uuid>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <name>instance-00000045</name>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerMetadataTestJSON-server-1798420740</nova:name>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:55:40</nova:creationTime>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:user uuid="14fea7f1307a4a04bd44f1831c499515">tempest-ServerMetadataTestJSON-1046671153-project-member</nova:user>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:project uuid="04bb699d9f7643838b7e68b6892b2373">tempest-ServerMetadataTestJSON-1046671153</nova:project>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        <nova:port uuid="376ec00d-cb9d-470f-abd0-a10f2086e245">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="serial">c4014a21-495c-43f6-b9b0-e6460ba53d12</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="uuid">c4014a21-495c-43f6-b9b0-e6460ba53d12</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.config"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:8c:90:d3"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <target dev="tap376ec00d-cb"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/console.log" append="off"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:55:40 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:55:40 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:55:40 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:55:40 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.681 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Preparing to wait for external event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.682 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.682 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.682 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.683 186853 DEBUG nova.virt.libvirt.vif [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1798420740',display_name='tempest-ServerMetadataTestJSON-server-1798420740',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1798420740',id=69,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='04bb699d9f7643838b7e68b6892b2373',ramdisk_id='',reservation_id='r-fp8d0mnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-1046671153',owner_user_name='tempest-ServerMetadataTestJSON-1046671153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:33Z,user_data=None,user_id='14fea7f1307a4a04bd44f1831c499515',uuid=c4014a21-495c-43f6-b9b0-e6460ba53d12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.683 186853 DEBUG nova.network.os_vif_util [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converting VIF {"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.684 186853 DEBUG nova.network.os_vif_util [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.684 186853 DEBUG os_vif [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.685 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.685 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.686 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.690 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.690 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap376ec00d-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.691 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap376ec00d-cb, col_values=(('external_ids', {'iface-id': '376ec00d-cb9d-470f-abd0-a10f2086e245', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:90:d3', 'vm-uuid': 'c4014a21-495c-43f6-b9b0-e6460ba53d12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.693 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531887 NetworkManager[55210]: <info>  [1763798140.6952] manager: (tap376ec00d-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.696 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.702 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.705 186853 INFO os_vif [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb')#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.794 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.795 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.796 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] No VIF found with MAC fa:16:3e:8c:90:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:55:40 np0005531887 nova_compute[186849]: 2025-11-22 07:55:40.797 186853 INFO nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Using config drive#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.424 186853 INFO nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Creating config drive at /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.config#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.430 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3fpjwh2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.560 186853 DEBUG oslo_concurrency.processutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3fpjwh2" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:41 np0005531887 kernel: tap376ec00d-cb: entered promiscuous mode
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.6399] manager: (tap376ec00d-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Nov 22 02:55:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:41Z|00168|binding|INFO|Claiming lport 376ec00d-cb9d-470f-abd0-a10f2086e245 for this chassis.
Nov 22 02:55:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:41Z|00169|binding|INFO|376ec00d-cb9d-470f-abd0-a10f2086e245: Claiming fa:16:3e:8c:90:d3 10.100.0.6
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.640 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.647 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.665 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:90:d3 10.100.0.6'], port_security=['fa:16:3e:8c:90:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4014a21-495c-43f6-b9b0-e6460ba53d12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '04bb699d9f7643838b7e68b6892b2373', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6296ac54-2d71-4ba1-a8e4-a8577b8e8d3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c563a3b5-fd1a-4517-8f11-38ef50aa5a82, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=376ec00d-cb9d-470f-abd0-a10f2086e245) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.668 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 376ec00d-cb9d-470f-abd0-a10f2086e245 in datapath bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 bound to our chassis#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.670 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5#033[00m
Nov 22 02:55:41 np0005531887 systemd-udevd[222863]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.683 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8472fd-0fe4-4106-8830-5ac7e039446f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.685 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbc5af6f1-31 in ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.6892] device (tap376ec00d-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.6905] device (tap376ec00d-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.687 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbc5af6f1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.688 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[21c8ed00-cd79-4ae4-8dd8-e6849819e3b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.692 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7640e605-76c0-4118-93b5-c37f2af1a1f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 systemd-machined[153180]: New machine qemu-26-instance-00000045.
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.707 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[fda3428c-0290-4504-93bb-89659d1dd147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.707 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:41Z|00170|binding|INFO|Setting lport 376ec00d-cb9d-470f-abd0-a10f2086e245 ovn-installed in OVS
Nov 22 02:55:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:41Z|00171|binding|INFO|Setting lport 376ec00d-cb9d-470f-abd0-a10f2086e245 up in Southbound
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 systemd[1]: Started Virtual Machine qemu-26-instance-00000045.
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.724 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b949711d-559c-4dab-abbb-a39e61a1fd42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.753 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2726e281-4d29-4fa7-a517-385efeee9f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.761 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e2efa9-1323-4b8f-a78f-9265ccd12f0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 systemd-udevd[222868]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.7643] manager: (tapbc5af6f1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.800 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[428b7c1a-e199-4b49-98d7-e40d90518698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.804 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[21517f85-888f-47b1-a543-106f1886357c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.8333] device (tapbc5af6f1-30): carrier: link connected
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.840 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cd210a0b-2c88-4f1f-b4f5-abba7eed3682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.863 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf83d32-6f0b-4583-afec-811d750f940d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc5af6f1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:0c:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488444, 'reachable_time': 37781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222899, 'error': None, 'target': 'ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.881 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2c9e6a-a2a6-4f25-90ad-361e91166a29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:c47'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488444, 'tstamp': 488444}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222900, 'error': None, 'target': 'ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.896 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bd2ef0-f9d4-4d1e-86ee-942adf078f86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc5af6f1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:0c:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488444, 'reachable_time': 37781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222901, 'error': None, 'target': 'ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.926 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[98927d6b-2fdd-4f02-a990-954f930a10bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.988 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[855fc6ae-96da-4eba-a1ce-085e45043960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.990 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc5af6f1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.990 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.990 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc5af6f1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:41 np0005531887 kernel: tapbc5af6f1-30: entered promiscuous mode
Nov 22 02:55:41 np0005531887 NetworkManager[55210]: <info>  [1763798141.9934] manager: (tapbc5af6f1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:41.996 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbc5af6f1-30, col_values=(('external_ids', {'iface-id': '3a7ed642-4189-4bc1-a6a6-f25f1d0c40dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:41 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:41Z|00172|binding|INFO|Releasing lport 3a7ed642-4189-4bc1-a6a6-f25f1d0c40dc from this chassis (sb_readonly=0)
Nov 22 02:55:41 np0005531887 nova_compute[186849]: 2025-11-22 07:55:41.997 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.007 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:42.008 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:42.009 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4d09cfdc-55f5-4936-972a-1b86ce225807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:42.010 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5.pid.haproxy
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:55:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:42.011 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'env', 'PROCESS_TAG=haproxy-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.129 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798142.1284647, c4014a21-495c-43f6-b9b0-e6460ba53d12 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.130 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] VM Started (Lifecycle Event)#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.157 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.163 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798142.128644, c4014a21-495c-43f6-b9b0-e6460ba53d12 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.164 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.188 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.193 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.230 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.334 186853 DEBUG nova.compute.manager [req-d51b269a-e2b2-46b2-ba82-b72c816671e4 req-a883ee1f-22ce-407c-b881-772550af337d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.335 186853 DEBUG oslo_concurrency.lockutils [req-d51b269a-e2b2-46b2-ba82-b72c816671e4 req-a883ee1f-22ce-407c-b881-772550af337d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.335 186853 DEBUG oslo_concurrency.lockutils [req-d51b269a-e2b2-46b2-ba82-b72c816671e4 req-a883ee1f-22ce-407c-b881-772550af337d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.335 186853 DEBUG oslo_concurrency.lockutils [req-d51b269a-e2b2-46b2-ba82-b72c816671e4 req-a883ee1f-22ce-407c-b881-772550af337d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.336 186853 DEBUG nova.compute.manager [req-d51b269a-e2b2-46b2-ba82-b72c816671e4 req-a883ee1f-22ce-407c-b881-772550af337d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Processing event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.336 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.341 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798142.3416688, c4014a21-495c-43f6-b9b0-e6460ba53d12 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.342 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.346 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.350 186853 INFO nova.virt.libvirt.driver [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance spawned successfully.#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.350 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.387 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.398 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.402 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.402 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.403 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.403 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.404 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.404 186853 DEBUG nova.virt.libvirt.driver [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:55:42 np0005531887 podman[222940]: 2025-11-22 07:55:42.433583283 +0000 UTC m=+0.103358307 container create d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.450 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:55:42 np0005531887 podman[222940]: 2025-11-22 07:55:42.363068645 +0000 UTC m=+0.032843679 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:55:42 np0005531887 systemd[1]: Started libpod-conmon-d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e.scope.
Nov 22 02:55:42 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.520 186853 INFO nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Took 8.92 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.522 186853 DEBUG nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:55:42 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb12c3debc87216b003d50434ea90f40ff2c1d9a43ccf293fe0d4a4ca94f5c51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:55:42 np0005531887 podman[222940]: 2025-11-22 07:55:42.560760082 +0000 UTC m=+0.230535116 container init d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:55:42 np0005531887 podman[222940]: 2025-11-22 07:55:42.567434198 +0000 UTC m=+0.237209202 container start d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:55:42 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [NOTICE]   (222960) : New worker (222962) forked
Nov 22 02:55:42 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [NOTICE]   (222960) : Loading success.
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.613 186853 INFO nova.compute.manager [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Took 9.69 seconds to build instance.#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.626 186853 DEBUG oslo_concurrency.lockutils [None req-062d5946-3b3f-4bb0-a1b0-434542a64832 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.709 186853 DEBUG nova.network.neutron [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updated VIF entry in instance network info cache for port 376ec00d-cb9d-470f-abd0-a10f2086e245. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.710 186853 DEBUG nova.network.neutron [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updating instance_info_cache with network_info: [{"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.729 186853 DEBUG oslo_concurrency.lockutils [req-7a260ecd-ecee-4f3d-8131-5a6d952e70d5 req-340f2d19-a528-44cc-9a94-0634a00dfd5a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:42 np0005531887 nova_compute[186849]: 2025-11-22 07:55:42.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:43 np0005531887 podman[222971]: 2025-11-22 07:55:43.845139082 +0000 UTC m=+0.058399937 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.713 186853 DEBUG nova.compute.manager [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.714 186853 DEBUG oslo_concurrency.lockutils [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.714 186853 DEBUG oslo_concurrency.lockutils [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.714 186853 DEBUG oslo_concurrency.lockutils [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.714 186853 DEBUG nova.compute.manager [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] No waiting events found dispatching network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:44 np0005531887 nova_compute[186849]: 2025-11-22 07:55:44.715 186853 WARNING nova.compute.manager [req-3a138054-dba1-4815-883a-e9bb7d8e131a req-b22564aa-d300-4a0f-9141-3ef505e49396 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received unexpected event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:55:45 np0005531887 nova_compute[186849]: 2025-11-22 07:55:45.694 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.794 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.903 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.970 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:46 np0005531887 nova_compute[186849]: 2025-11-22 07:55:46.972 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.036 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.216 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.218 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5539MB free_disk=73.4151840209961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.218 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.219 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.329 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c4014a21-495c-43f6-b9b0-e6460ba53d12 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.331 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.332 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.434 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.451 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.481 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:55:47 np0005531887 nova_compute[186849]: 2025-11-22 07:55:47.482 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.483 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.484 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.485 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:55:48 np0005531887 nova_compute[186849]: 2025-11-22 07:55:48.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.391 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.392 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.392 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.393 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4014a21-495c-43f6-b9b0-e6460ba53d12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:49 np0005531887 podman[223004]: 2025-11-22 07:55:49.845972534 +0000 UTC m=+0.062958200 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.977 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.979 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.979 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.979 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.979 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.987 186853 INFO nova.compute.manager [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Terminating instance#033[00m
Nov 22 02:55:49 np0005531887 nova_compute[186849]: 2025-11-22 07:55:49.995 186853 DEBUG nova.compute.manager [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:55:50 np0005531887 kernel: tap376ec00d-cb (unregistering): left promiscuous mode
Nov 22 02:55:50 np0005531887 NetworkManager[55210]: <info>  [1763798150.0190] device (tap376ec00d-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.030 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:50Z|00173|binding|INFO|Releasing lport 376ec00d-cb9d-470f-abd0-a10f2086e245 from this chassis (sb_readonly=0)
Nov 22 02:55:50 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:50Z|00174|binding|INFO|Setting lport 376ec00d-cb9d-470f-abd0-a10f2086e245 down in Southbound
Nov 22 02:55:50 np0005531887 ovn_controller[95130]: 2025-11-22T07:55:50Z|00175|binding|INFO|Removing iface tap376ec00d-cb ovn-installed in OVS
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.036 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.047 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:90:d3 10.100.0.6'], port_security=['fa:16:3e:8c:90:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4014a21-495c-43f6-b9b0-e6460ba53d12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '04bb699d9f7643838b7e68b6892b2373', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6296ac54-2d71-4ba1-a8e4-a8577b8e8d3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c563a3b5-fd1a-4517-8f11-38ef50aa5a82, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=376ec00d-cb9d-470f-abd0-a10f2086e245) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.049 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 376ec00d-cb9d-470f-abd0-a10f2086e245 in datapath bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 unbound from our chassis#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.050 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.050 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.054 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2215a133-799c-4d00-87ca-763c429ecbbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.056 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 namespace which is not needed anymore#033[00m
Nov 22 02:55:50 np0005531887 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000045.scope: Deactivated successfully.
Nov 22 02:55:50 np0005531887 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000045.scope: Consumed 8.191s CPU time.
Nov 22 02:55:50 np0005531887 systemd-machined[153180]: Machine qemu-26-instance-00000045 terminated.
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.278 186853 INFO nova.virt.libvirt.driver [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance destroyed successfully.#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.279 186853 DEBUG nova.objects.instance [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lazy-loading 'resources' on Instance uuid c4014a21-495c-43f6-b9b0-e6460ba53d12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.296 186853 DEBUG nova.virt.libvirt.vif [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1798420740',display_name='tempest-ServerMetadataTestJSON-server-1798420740',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1798420740',id=69,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='04bb699d9f7643838b7e68b6892b2373',ramdisk_id='',reservation_id='r-fp8d0mnk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='
virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-1046671153',owner_user_name='tempest-ServerMetadataTestJSON-1046671153-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:49Z,user_data=None,user_id='14fea7f1307a4a04bd44f1831c499515',uuid=c4014a21-495c-43f6-b9b0-e6460ba53d12,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.296 186853 DEBUG nova.network.os_vif_util [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converting VIF {"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.297 186853 DEBUG nova.network.os_vif_util [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.297 186853 DEBUG os_vif [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.300 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.300 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap376ec00d-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.302 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.304 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.306 186853 INFO os_vif [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:90:d3,bridge_name='br-int',has_traffic_filtering=True,id=376ec00d-cb9d-470f-abd0-a10f2086e245,network=Network(bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap376ec00d-cb')#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.306 186853 INFO nova.virt.libvirt.driver [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Deleting instance files /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12_del#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.307 186853 INFO nova.virt.libvirt.driver [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Deletion of /var/lib/nova/instances/c4014a21-495c-43f6-b9b0-e6460ba53d12_del complete#033[00m
Nov 22 02:55:50 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [NOTICE]   (222960) : haproxy version is 2.8.14-c23fe91
Nov 22 02:55:50 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [NOTICE]   (222960) : path to executable is /usr/sbin/haproxy
Nov 22 02:55:50 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [WARNING]  (222960) : Exiting Master process...
Nov 22 02:55:50 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [ALERT]    (222960) : Current worker (222962) exited with code 143 (Terminated)
Nov 22 02:55:50 np0005531887 neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5[222956]: [WARNING]  (222960) : All workers exited. Exiting... (0)
Nov 22 02:55:50 np0005531887 systemd[1]: libpod-d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e.scope: Deactivated successfully.
Nov 22 02:55:50 np0005531887 podman[223048]: 2025-11-22 07:55:50.321220198 +0000 UTC m=+0.151142058 container died d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:55:50 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e-userdata-shm.mount: Deactivated successfully.
Nov 22 02:55:50 np0005531887 systemd[1]: var-lib-containers-storage-overlay-cb12c3debc87216b003d50434ea90f40ff2c1d9a43ccf293fe0d4a4ca94f5c51-merged.mount: Deactivated successfully.
Nov 22 02:55:50 np0005531887 podman[223048]: 2025-11-22 07:55:50.36063529 +0000 UTC m=+0.190557170 container cleanup d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:55:50 np0005531887 systemd[1]: libpod-conmon-d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e.scope: Deactivated successfully.
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.396 186853 INFO nova.compute.manager [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.397 186853 DEBUG oslo.service.loopingcall [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.397 186853 DEBUG nova.compute.manager [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.397 186853 DEBUG nova.network.neutron [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:55:50 np0005531887 podman[223092]: 2025-11-22 07:55:50.434195933 +0000 UTC m=+0.051389142 container remove d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.441 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5a1749-7543-4907-a3cc-8e8d505769c8]: (4, ('Sat Nov 22 07:55:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 (d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e)\nd0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e\nSat Nov 22 07:55:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 (d0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e)\nd0646671defc9ccbfab70d23cd0c242eb851a8b1e326d5a861861d0f70f1a12e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.443 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a091d787-00b9-455b-9b79-4ca15cea128e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.444 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc5af6f1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.447 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 kernel: tapbc5af6f1-30: left promiscuous mode
Nov 22 02:55:50 np0005531887 nova_compute[186849]: 2025-11-22 07:55:50.460 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.463 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f912adaa-f814-4a4a-ab2b-b13bdb205059]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.485 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[32e0ca2a-6fc3-401d-8274-240b20c7b21f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.486 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5243713b-4b5f-4ea9-b024-1f0886f5df1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.504 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9642285d-24ae-4f1c-b6c1-ef06defdc232]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488435, 'reachable_time': 22998, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223107, 'error': None, 'target': 'ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.507 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:55:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:55:50.508 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[16fd3299-0c0d-47da-81b8-fcc4202b9d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:55:50 np0005531887 systemd[1]: run-netns-ovnmeta\x2dbc5af6f1\x2d3dd2\x2d4c09\x2d8226\x2d9d0ca73df2f5.mount: Deactivated successfully.
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.317 186853 DEBUG nova.compute.manager [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-unplugged-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.318 186853 DEBUG oslo_concurrency.lockutils [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.318 186853 DEBUG oslo_concurrency.lockutils [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.318 186853 DEBUG oslo_concurrency.lockutils [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.318 186853 DEBUG nova.compute.manager [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] No waiting events found dispatching network-vif-unplugged-376ec00d-cb9d-470f-abd0-a10f2086e245 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.318 186853 DEBUG nova.compute.manager [req-d3443cff-f2d7-40ba-86a7-8c731655f7a6 req-fcc545ec-a2a1-4ce5-9ada-f2fe9a7aecaf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-unplugged-376ec00d-cb9d-470f-abd0-a10f2086e245 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.505 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.935 186853 DEBUG nova.network.neutron [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:51 np0005531887 nova_compute[186849]: 2025-11-22 07:55:51.959 186853 INFO nova.compute.manager [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Took 1.56 seconds to deallocate network for instance.#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.049 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.050 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.110 186853 DEBUG nova.compute.provider_tree [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.132 186853 DEBUG nova.scheduler.client.report [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.153 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.191 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updating instance_info_cache with network_info: [{"id": "376ec00d-cb9d-470f-abd0-a10f2086e245", "address": "fa:16:3e:8c:90:d3", "network": {"id": "bc5af6f1-3dd2-4c09-8226-9d0ca73df2f5", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1940588496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "04bb699d9f7643838b7e68b6892b2373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap376ec00d-cb", "ovs_interfaceid": "376ec00d-cb9d-470f-abd0-a10f2086e245", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.206 186853 INFO nova.scheduler.client.report [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Deleted allocations for instance c4014a21-495c-43f6-b9b0-e6460ba53d12#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.230 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-c4014a21-495c-43f6-b9b0-e6460ba53d12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.230 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.230 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.231 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:52 np0005531887 nova_compute[186849]: 2025-11-22 07:55:52.326 186853 DEBUG oslo_concurrency.lockutils [None req-167b41e3-cbf0-4ef4-b0af-72ca5a0f7eae 14fea7f1307a4a04bd44f1831c499515 04bb699d9f7643838b7e68b6892b2373 - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:52 np0005531887 podman[223108]: 2025-11-22 07:55:52.848858621 +0000 UTC m=+0.064586251 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.471 186853 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.471 186853 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.471 186853 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.471 186853 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c4014a21-495c-43f6-b9b0-e6460ba53d12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.472 186853 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] No waiting events found dispatching network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.472 186853 WARNING nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received unexpected event network-vif-plugged-376ec00d-cb9d-470f-abd0-a10f2086e245 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.472 186853 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Received event network-vif-deleted-376ec00d-cb9d-470f-abd0-a10f2086e245 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.472 186853 INFO nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Neutron deleted interface 376ec00d-cb9d-470f-abd0-a10f2086e245; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.473 186853 DEBUG nova.network.neutron [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 22 02:55:53 np0005531887 nova_compute[186849]: 2025-11-22 07:55:53.475 186853 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Detach interface failed, port_id=376ec00d-cb9d-470f-abd0-a10f2086e245, reason: Instance c4014a21-495c-43f6-b9b0-e6460ba53d12 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 02:55:55 np0005531887 nova_compute[186849]: 2025-11-22 07:55:55.304 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:56 np0005531887 nova_compute[186849]: 2025-11-22 07:55:56.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:56 np0005531887 nova_compute[186849]: 2025-11-22 07:55:56.768 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.354 186853 DEBUG nova.compute.manager [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.471 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.471 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.497 186853 DEBUG nova.objects.instance [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.512 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.513 186853 INFO nova.compute.claims [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.514 186853 DEBUG nova.objects.instance [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.527 186853 DEBUG nova.objects.instance [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.569 186853 INFO nova.compute.resource_tracker [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating resource usage from migration 7911db20-f973-4a7b-bbc8-bc7a93d3bceb#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.570 186853 DEBUG nova.compute.resource_tracker [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Starting to track incoming migration 7911db20-f973-4a7b-bbc8-bc7a93d3bceb with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.640 186853 DEBUG nova.compute.provider_tree [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.653 186853 DEBUG nova.scheduler.client.report [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.688 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:55:57 np0005531887 nova_compute[186849]: 2025-11-22 07:55:57.689 186853 INFO nova.compute.manager [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Migrating#033[00m
Nov 22 02:55:57 np0005531887 podman[223129]: 2025-11-22 07:55:57.833693354 +0000 UTC m=+0.053033543 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:55:59 np0005531887 nova_compute[186849]: 2025-11-22 07:55:59.223 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:55:59 np0005531887 systemd-logind[821]: New session 44 of user nova.
Nov 22 02:55:59 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:55:59 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:55:59 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:55:59 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:55:59 np0005531887 systemd[223158]: Queued start job for default target Main User Target.
Nov 22 02:55:59 np0005531887 systemd[223158]: Created slice User Application Slice.
Nov 22 02:55:59 np0005531887 systemd[223158]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:55:59 np0005531887 systemd[223158]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:55:59 np0005531887 systemd[223158]: Reached target Paths.
Nov 22 02:55:59 np0005531887 systemd[223158]: Reached target Timers.
Nov 22 02:55:59 np0005531887 systemd[223158]: Starting D-Bus User Message Bus Socket...
Nov 22 02:55:59 np0005531887 systemd[223158]: Starting Create User's Volatile Files and Directories...
Nov 22 02:55:59 np0005531887 systemd[223158]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:55:59 np0005531887 systemd[223158]: Reached target Sockets.
Nov 22 02:55:59 np0005531887 systemd[223158]: Finished Create User's Volatile Files and Directories.
Nov 22 02:55:59 np0005531887 systemd[223158]: Reached target Basic System.
Nov 22 02:55:59 np0005531887 systemd[223158]: Reached target Main User Target.
Nov 22 02:55:59 np0005531887 systemd[223158]: Startup finished in 151ms.
Nov 22 02:55:59 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:55:59 np0005531887 systemd[1]: Started Session 44 of User nova.
Nov 22 02:55:59 np0005531887 systemd[1]: session-44.scope: Deactivated successfully.
Nov 22 02:55:59 np0005531887 systemd-logind[821]: Session 44 logged out. Waiting for processes to exit.
Nov 22 02:55:59 np0005531887 systemd-logind[821]: Removed session 44.
Nov 22 02:55:59 np0005531887 systemd-logind[821]: New session 46 of user nova.
Nov 22 02:55:59 np0005531887 systemd[1]: Started Session 46 of User nova.
Nov 22 02:56:00 np0005531887 systemd[1]: session-46.scope: Deactivated successfully.
Nov 22 02:56:00 np0005531887 systemd-logind[821]: Session 46 logged out. Waiting for processes to exit.
Nov 22 02:56:00 np0005531887 systemd-logind[821]: Removed session 46.
Nov 22 02:56:00 np0005531887 nova_compute[186849]: 2025-11-22 07:56:00.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:01 np0005531887 nova_compute[186849]: 2025-11-22 07:56:01.509 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:03 np0005531887 podman[223180]: 2025-11-22 07:56:03.853374005 +0000 UTC m=+0.074285112 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 02:56:05 np0005531887 nova_compute[186849]: 2025-11-22 07:56:05.277 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798150.2764115, c4014a21-495c-43f6-b9b0-e6460ba53d12 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:05 np0005531887 nova_compute[186849]: 2025-11-22 07:56:05.278 186853 INFO nova.compute.manager [-] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:56:05 np0005531887 nova_compute[186849]: 2025-11-22 07:56:05.298 186853 DEBUG nova.compute.manager [None req-9ab9279b-03ce-46d0-8c07-be0ebbfaeb8b - - - - - -] [instance: c4014a21-495c-43f6-b9b0-e6460ba53d12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:05 np0005531887 nova_compute[186849]: 2025-11-22 07:56:05.311 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:06 np0005531887 nova_compute[186849]: 2025-11-22 07:56:06.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:07 np0005531887 podman[223200]: 2025-11-22 07:56:07.852430199 +0000 UTC m=+0.070771264 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 02:56:07 np0005531887 podman[223201]: 2025-11-22 07:56:07.910124697 +0000 UTC m=+0.123423667 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:56:10 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:56:10 np0005531887 systemd[223158]: Activating special unit Exit the Session...
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped target Main User Target.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped target Basic System.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped target Paths.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped target Sockets.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped target Timers.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:56:10 np0005531887 systemd[223158]: Closed D-Bus User Message Bus Socket.
Nov 22 02:56:10 np0005531887 systemd[223158]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:56:10 np0005531887 systemd[223158]: Removed slice User Application Slice.
Nov 22 02:56:10 np0005531887 systemd[223158]: Reached target Shutdown.
Nov 22 02:56:10 np0005531887 systemd[223158]: Finished Exit the Session.
Nov 22 02:56:10 np0005531887 systemd[223158]: Reached target Exit the Session.
Nov 22 02:56:10 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:56:10 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:56:10 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:56:10 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:56:10 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:56:10 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:56:10 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:56:10 np0005531887 nova_compute[186849]: 2025-11-22 07:56:10.314 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:11 np0005531887 nova_compute[186849]: 2025-11-22 07:56:11.513 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:13 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 02:56:13 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 02:56:13 np0005531887 systemd-logind[821]: New session 47 of user nova.
Nov 22 02:56:13 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 02:56:13 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 02:56:13 np0005531887 systemd[223248]: Queued start job for default target Main User Target.
Nov 22 02:56:13 np0005531887 systemd[223248]: Created slice User Application Slice.
Nov 22 02:56:13 np0005531887 systemd[223248]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:56:13 np0005531887 systemd[223248]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 02:56:13 np0005531887 systemd[223248]: Reached target Paths.
Nov 22 02:56:13 np0005531887 systemd[223248]: Reached target Timers.
Nov 22 02:56:13 np0005531887 systemd[223248]: Starting D-Bus User Message Bus Socket...
Nov 22 02:56:13 np0005531887 systemd[223248]: Starting Create User's Volatile Files and Directories...
Nov 22 02:56:13 np0005531887 systemd[223248]: Finished Create User's Volatile Files and Directories.
Nov 22 02:56:13 np0005531887 systemd[223248]: Listening on D-Bus User Message Bus Socket.
Nov 22 02:56:13 np0005531887 systemd[223248]: Reached target Sockets.
Nov 22 02:56:13 np0005531887 systemd[223248]: Reached target Basic System.
Nov 22 02:56:13 np0005531887 systemd[223248]: Reached target Main User Target.
Nov 22 02:56:13 np0005531887 systemd[223248]: Startup finished in 125ms.
Nov 22 02:56:13 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 02:56:13 np0005531887 systemd[1]: Started Session 47 of User nova.
Nov 22 02:56:14 np0005531887 systemd[1]: session-47.scope: Deactivated successfully.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Session 47 logged out. Waiting for processes to exit.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Removed session 47.
Nov 22 02:56:14 np0005531887 podman[223265]: 2025-11-22 07:56:14.184904427 +0000 UTC m=+0.062345606 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:56:14 np0005531887 systemd-logind[821]: New session 49 of user nova.
Nov 22 02:56:14 np0005531887 systemd[1]: Started Session 49 of User nova.
Nov 22 02:56:14 np0005531887 systemd[1]: session-49.scope: Deactivated successfully.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Session 49 logged out. Waiting for processes to exit.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Removed session 49.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: New session 50 of user nova.
Nov 22 02:56:14 np0005531887 systemd[1]: Started Session 50 of User nova.
Nov 22 02:56:14 np0005531887 systemd[1]: session-50.scope: Deactivated successfully.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Session 50 logged out. Waiting for processes to exit.
Nov 22 02:56:14 np0005531887 systemd-logind[821]: Removed session 50.
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.317 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.523 186853 INFO nova.network.neutron [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating port 041d3fd2-9a77-48a1-b976-9dda05f01f7b with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.939 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.940 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.960 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.974 186853 DEBUG nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.974 186853 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.974 186853 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.975 186853 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.975 186853 DEBUG nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:56:15 np0005531887 nova_compute[186849]: 2025-11-22 07:56:15.975 186853 WARNING nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state resize_migrated.
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.083 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.084 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.093 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.094 186853 INFO nova.compute.claims [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Claim successful on node compute-1.ctlplane.example.com
Nov 22 02:56:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:16.196 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.196 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:56:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:16.197 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.273 186853 DEBUG nova.compute.provider_tree [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.285 186853 DEBUG nova.scheduler.client.report [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.381 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.382 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.492 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.493 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.514 186853 INFO nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.539 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.641 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.643 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.643 186853 INFO nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Creating image(s)
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.644 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.645 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.645 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.660 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.700 186853 DEBUG nova.policy [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '468858baf68a4236b3d03dc430310165', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73fcb22503a7438a99d8946a1ffb38ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.737 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.739 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.740 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.752 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.813 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:56:16 np0005531887 nova_compute[186849]: 2025-11-22 07:56:16.814 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.100 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk 1073741824" returned: 0 in 0.285s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.101 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.102 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.172 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.173 186853 DEBUG nova.virt.disk.api [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Checking if we can resize image /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.174 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.238 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.240 186853 DEBUG nova.virt.disk.api [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Cannot resize image /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.240 186853 DEBUG nova.objects.instance [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lazy-loading 'migration_context' on Instance uuid 8e21ac2e-9273-4909-9626-f29aae1d2c5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.254 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.255 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Ensure instance console log exists: /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.255 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.256 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.256 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.381 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.382 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.382 186853 DEBUG nova.network.neutron [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.479 186853 DEBUG nova.compute.manager [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.479 186853 DEBUG nova.compute.manager [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing instance network info cache due to event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 02:56:17 np0005531887 nova_compute[186849]: 2025-11-22 07:56:17.480 186853 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.065 186853 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.065 186853 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.066 186853 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.066 186853 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.066 186853 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.067 186853 WARNING nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state resize_migrated.
Nov 22 02:56:18 np0005531887 nova_compute[186849]: 2025-11-22 07:56:18.213 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Successfully created port: 5a3c5c20-b82a-439a-ad72-9d1caab258ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.258 186853 DEBUG nova.network.neutron [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.270 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.278 186853 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.278 186853 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.512 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Successfully updated port: 5a3c5c20-b82a-439a-ad72-9d1caab258ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.531 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.532 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquired lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.532 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.534 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.536 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.536 186853 INFO nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Creating image(s)
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.537 186853 DEBUG nova.objects.instance [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.556 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.621 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.623 186853 DEBUG nova.virt.disk.api [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.623 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.680 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.681 186853 DEBUG nova.virt.disk.api [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.695 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.695 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Ensure instance console log exists: /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.696 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.696 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.696 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.699 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start _get_guest_xml network_info=[{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.703 186853 WARNING nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.708 186853 DEBUG nova.virt.libvirt.host [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.709 186853 DEBUG nova.virt.libvirt.host [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.713 186853 DEBUG nova.virt.libvirt.host [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.713 186853 DEBUG nova.virt.libvirt.host [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.714 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.715 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.715 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.715 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.715 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.716 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.717 186853 DEBUG nova.virt.hardware [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.717 186853 DEBUG nova.objects.instance [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.737 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.759 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.817 186853 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.817 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.818 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.819 186853 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.820 186853 DEBUG nova.virt.libvirt.vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:15Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.820 186853 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.821 186853 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.823 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <uuid>5eafb037-41a2-463f-9d3a-1b4248cb00f2</uuid>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <name>instance-00000046</name>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <memory>196608</memory>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1663966303</nova:name>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:56:19</nova:creationTime>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.micro">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:memory>192</nova:memory>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        <nova:port uuid="041d3fd2-9a77-48a1-b976-9dda05f01f7b">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="serial">5eafb037-41a2-463f-9d3a-1b4248cb00f2</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="uuid">5eafb037-41a2-463f-9d3a-1b4248cb00f2</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:31:97:43"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <target dev="tap041d3fd2-9a"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/console.log" append="off"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:56:19 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:56:19 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:56:19 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:56:19 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.825 186853 DEBUG nova.virt.libvirt.vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio
',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:15Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.826 186853 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.826 186853 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.827 186853 DEBUG os_vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.828 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.828 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.832 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.833 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap041d3fd2-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.834 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap041d3fd2-9a, col_values=(('external_ids', {'iface-id': '041d3fd2-9a77-48a1-b976-9dda05f01f7b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:97:43', 'vm-uuid': '5eafb037-41a2-463f-9d3a-1b4248cb00f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.835 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:19 np0005531887 NetworkManager[55210]: <info>  [1763798179.8373] manager: (tap041d3fd2-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.838 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.843 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.844 186853 INFO os_vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a')#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.900 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.902 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.902 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:31:97:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.903 186853 INFO nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Using config drive#033[00m
Nov 22 02:56:19 np0005531887 kernel: tap041d3fd2-9a: entered promiscuous mode
Nov 22 02:56:19 np0005531887 NetworkManager[55210]: <info>  [1763798179.9808] manager: (tap041d3fd2-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Nov 22 02:56:19 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:19Z|00176|binding|INFO|Claiming lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b for this chassis.
Nov 22 02:56:19 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:19Z|00177|binding|INFO|041d3fd2-9a77-48a1-b976-9dda05f01f7b: Claiming fa:16:3e:31:97:43 10.100.0.14
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.982 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:19 np0005531887 nova_compute[186849]: 2025-11-22 07:56:19.985 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:20 np0005531887 systemd-udevd[223351]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:20 np0005531887 systemd-machined[153180]: New machine qemu-27-instance-00000046.
Nov 22 02:56:20 np0005531887 NetworkManager[55210]: <info>  [1763798180.0327] device (tap041d3fd2-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:56:20 np0005531887 NetworkManager[55210]: <info>  [1763798180.0341] device (tap041d3fd2-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:20Z|00178|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b ovn-installed in OVS
Nov 22 02:56:20 np0005531887 systemd[1]: Started Virtual Machine qemu-27-instance-00000046.
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.050 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:20 np0005531887 podman[223335]: 2025-11-22 07:56:20.062385015 +0000 UTC m=+0.088181138 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:20Z|00179|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b up in Southbound
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.103 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:97:43 10.100.0.14'], port_security=['fa:16:3e:31:97:43 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5eafb037-41a2-463f-9d3a-1b4248cb00f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=041d3fd2-9a77-48a1-b976-9dda05f01f7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.104 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 041d3fd2-9a77-48a1-b976-9dda05f01f7b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.106 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.121 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[78d5f116-5b04-4038-9464-0f9f0f71be4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.122 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.124 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.124 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[58b71e15-b1b0-41ff-aa70-f275614d3f27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.125 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4437c892-04fe-4b14-a52f-94bc141cdace]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.139 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[44e9bf12-375c-4f1e-b94d-55a7ec006588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.163 186853 DEBUG nova.compute.manager [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-changed-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.165 186853 DEBUG nova.compute.manager [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Refreshing instance network info cache due to event network-changed-5a3c5c20-b82a-439a-ad72-9d1caab258ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.165 186853 DEBUG oslo_concurrency.lockutils [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.165 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[94584ca9-cbef-4dbd-8799-608fdb6e68f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.198 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ad52feea-0c09-461b-98b8-a7789b83cf63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.202 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed13f8f-4324-45ed-baf2-79203deb601c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 NetworkManager[55210]: <info>  [1763798180.2043] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Nov 22 02:56:20 np0005531887 systemd-udevd[223358]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.237 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea2b0a8-9d2e-4bb9-b823-6820ad671027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.240 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[fb73ded5-c52c-4ec2-876c-ed3da3d117fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 NetworkManager[55210]: <info>  [1763798180.2672] device (tapd54e232a-50): carrier: link connected
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.272 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4c6c0369-dd44-414e-b9bc-53a6c6791ba2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.293 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[789bfbd1-8431-4bdc-9598-d420f4ea7762]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492287, 'reachable_time': 36003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223391, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.317 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cfcf3923-d6dd-424b-8278-29989df53664]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492287, 'tstamp': 492287}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223392, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.347 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[986f5ce9-8792-4548-949a-6f708443dda4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492287, 'reachable_time': 36003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223393, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.380 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[15bbc02a-1043-461f-81f6-9c239eb8eb5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.443 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[71e29dbf-3f6e-461a-becd-01ce1facb849]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.445 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.445 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.446 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.448 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:20 np0005531887 NetworkManager[55210]: <info>  [1763798180.4491] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Nov 22 02:56:20 np0005531887 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.451 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:20 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:20Z|00180|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.469 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.470 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.471 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a8050a-ace6-44d6-ab8e-00c9472c50fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.472 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:56:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:20.473 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.557 186853 DEBUG nova.compute.manager [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.558 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798180.5569534, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.559 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.565 186853 INFO nova.virt.libvirt.driver [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance running successfully.#033[00m
Nov 22 02:56:20 np0005531887 virtqemud[186424]: argument unsupported: QEMU guest agent is not configured
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.569 186853 DEBUG nova.virt.libvirt.guest [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.569 186853 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.574 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.578 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.598 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.598 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798180.558037, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.599 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Started (Lifecycle Event)#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.620 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.623 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.644 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.826 186853 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updated VIF entry in instance network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.828 186853 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:20 np0005531887 nova_compute[186849]: 2025-11-22 07:56:20.860 186853 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:20 np0005531887 podman[223432]: 2025-11-22 07:56:20.922013209 +0000 UTC m=+0.063003711 container create e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:56:20 np0005531887 systemd[1]: Started libpod-conmon-e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42.scope.
Nov 22 02:56:20 np0005531887 podman[223432]: 2025-11-22 07:56:20.89274594 +0000 UTC m=+0.033736472 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:56:20 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:56:20 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a1fe8b600a9bfcc1476e28b6cc24df9875dca020f1fb5c431cfa6c42ced55d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:56:21 np0005531887 podman[223432]: 2025-11-22 07:56:21.011737565 +0000 UTC m=+0.152728087 container init e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:56:21 np0005531887 podman[223432]: 2025-11-22 07:56:21.019685952 +0000 UTC m=+0.160676454 container start e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 02:56:21 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [NOTICE]   (223452) : New worker (223454) forked
Nov 22 02:56:21 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [NOTICE]   (223452) : Loading success.
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.182 186853 DEBUG nova.network.neutron [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Updating instance_info_cache with network_info: [{"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.327 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Releasing lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.328 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Instance network_info: |[{"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.329 186853 DEBUG oslo_concurrency.lockutils [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.331 186853 DEBUG nova.network.neutron [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Refreshing network info cache for port 5a3c5c20-b82a-439a-ad72-9d1caab258ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.334 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Start _get_guest_xml network_info=[{"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.341 186853 WARNING nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.354 186853 DEBUG nova.virt.libvirt.host [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.355 186853 DEBUG nova.virt.libvirt.host [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.362 186853 DEBUG nova.virt.libvirt.host [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.363 186853 DEBUG nova.virt.libvirt.host [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.364 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.365 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.365 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.366 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.366 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.366 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.367 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.367 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.367 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.368 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.368 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.368 186853 DEBUG nova.virt.hardware [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.372 186853 DEBUG nova.virt.libvirt.vif [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-667251429',display_name='tempest-ServerMetadataNegativeTestJSON-server-667251429',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-667251429',id=74,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73fcb22503a7438a99d8946a1ffb38ad',ramdisk_id='',reservation_id='r-wtkezp7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1154116890',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1154116890-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:16Z,user_data=None,user_id='468858baf68a4236b3d03dc430310165',uuid=8e21ac2e-9273-4909-9626-f29aae1d2c5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.373 186853 DEBUG nova.network.os_vif_util [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converting VIF {"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.374 186853 DEBUG nova.network.os_vif_util [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.375 186853 DEBUG nova.objects.instance [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lazy-loading 'pci_devices' on Instance uuid 8e21ac2e-9273-4909-9626-f29aae1d2c5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.388 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <uuid>8e21ac2e-9273-4909-9626-f29aae1d2c5a</uuid>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <name>instance-0000004a</name>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-667251429</nova:name>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:56:21</nova:creationTime>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:user uuid="468858baf68a4236b3d03dc430310165">tempest-ServerMetadataNegativeTestJSON-1154116890-project-member</nova:user>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:project uuid="73fcb22503a7438a99d8946a1ffb38ad">tempest-ServerMetadataNegativeTestJSON-1154116890</nova:project>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        <nova:port uuid="5a3c5c20-b82a-439a-ad72-9d1caab258ce">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="serial">8e21ac2e-9273-4909-9626-f29aae1d2c5a</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="uuid">8e21ac2e-9273-4909-9626-f29aae1d2c5a</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.config"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:3e:57:33"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <target dev="tap5a3c5c20-b8"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/console.log" append="off"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:56:21 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:56:21 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:56:21 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:56:21 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.394 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Preparing to wait for external event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.395 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.395 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.395 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.396 186853 DEBUG nova.virt.libvirt.vif [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-667251429',display_name='tempest-ServerMetadataNegativeTestJSON-server-667251429',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-667251429',id=74,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73fcb22503a7438a99d8946a1ffb38ad',ramdisk_id='',reservation_id='r-wtkezp7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1154116890',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1154116890-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:16Z,user_data=None,user_id='468858baf68a4236b3d03dc430310165',uuid=8e21ac2e-9273-4909-9626-f29aae1d2c5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.397 186853 DEBUG nova.network.os_vif_util [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converting VIF {"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.397 186853 DEBUG nova.network.os_vif_util [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.397 186853 DEBUG os_vif [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.398 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.399 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.401 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.401 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a3c5c20-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.402 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5a3c5c20-b8, col_values=(('external_ids', {'iface-id': '5a3c5c20-b82a-439a-ad72-9d1caab258ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:57:33', 'vm-uuid': '8e21ac2e-9273-4909-9626-f29aae1d2c5a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 NetworkManager[55210]: <info>  [1763798181.4048] manager: (tap5a3c5c20-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.411 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.412 186853 INFO os_vif [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8')#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.478 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.478 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.479 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] No VIF found with MAC fa:16:3e:3e:57:33, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.479 186853 INFO nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Using config drive#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.771 186853 INFO nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Creating config drive at /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.config#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.782 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5u3a_gm_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.910 186853 DEBUG oslo_concurrency.processutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5u3a_gm_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:21 np0005531887 kernel: tap5a3c5c20-b8: entered promiscuous mode
Nov 22 02:56:21 np0005531887 NetworkManager[55210]: <info>  [1763798181.9821] manager: (tap5a3c5c20-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Nov 22 02:56:21 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:21Z|00181|binding|INFO|Claiming lport 5a3c5c20-b82a-439a-ad72-9d1caab258ce for this chassis.
Nov 22 02:56:21 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:21Z|00182|binding|INFO|5a3c5c20-b82a-439a-ad72-9d1caab258ce: Claiming fa:16:3e:3e:57:33 10.100.0.9
Nov 22 02:56:21 np0005531887 nova_compute[186849]: 2025-11-22 07:56:21.985 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:21 np0005531887 NetworkManager[55210]: <info>  [1763798181.9959] device (tap5a3c5c20-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:56:21 np0005531887 NetworkManager[55210]: <info>  [1763798181.9978] device (tap5a3c5c20-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:56:22 np0005531887 systemd-machined[153180]: New machine qemu-28-instance-0000004a.
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.044 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:57:33 10.100.0.9'], port_security=['fa:16:3e:3e:57:33 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8e21ac2e-9273-4909-9626-f29aae1d2c5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73fcb22503a7438a99d8946a1ffb38ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5508c60b-1199-4229-9070-d4b6ffba0161', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43c6129c-fbb4-4ac7-933b-09f225eb7e77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=5a3c5c20-b82a-439a-ad72-9d1caab258ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.046 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 5a3c5c20-b82a-439a-ad72-9d1caab258ce in datapath 7b1d5ca6-1256-44b6-ace0-8cd170d3014e bound to our chassis#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.047 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.047 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b1d5ca6-1256-44b6-ace0-8cd170d3014e#033[00m
Nov 22 02:56:22 np0005531887 systemd[1]: Started Virtual Machine qemu-28-instance-0000004a.
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.055 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:22Z|00183|binding|INFO|Setting lport 5a3c5c20-b82a-439a-ad72-9d1caab258ce ovn-installed in OVS
Nov 22 02:56:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:22Z|00184|binding|INFO|Setting lport 5a3c5c20-b82a-439a-ad72-9d1caab258ce up in Southbound
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.061 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[127ec15f-044d-4c82-915f-5aa9a9517413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.062 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b1d5ca6-11 in ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.064 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b1d5ca6-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.064 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9585e711-eb35-47c2-9a8d-73aa09d87ef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.065 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[85390b5a-56e9-4e3f-824a-1954d04f19a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.075 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6fac87-ce3b-4224-97f8-2973d469598e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.087 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[576041b1-c306-475f-a224-40beaa66f1ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.114 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa4b61d-086a-4612-97e8-6b5b0b2d41c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.118 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd669f4-62b4-4ac1-9b88-911346866e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 NetworkManager[55210]: <info>  [1763798182.1205] manager: (tap7b1d5ca6-10): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.151 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6d12d982-6589-4867-be70-e73e274689ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.154 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd53529-122e-4de6-87a5-28e42d92a044]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 NetworkManager[55210]: <info>  [1763798182.1781] device (tap7b1d5ca6-10): carrier: link connected
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.186 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[af7d9600-ff0d-4350-8daf-56d05d7d03d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.199 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.206 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d2339352-6e72-4e9e-9ada-2272791734b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b1d5ca6-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:a0:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492478, 'reachable_time': 36984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223500, 'error': None, 'target': 'ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.223 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d6b223-4d38-4782-9d78-7cd24b9e2204]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:a04c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492478, 'tstamp': 492478}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223501, 'error': None, 'target': 'ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.242 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fa096dfe-6cf9-49b7-b9b8-f1d759a585cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b1d5ca6-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:a0:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492478, 'reachable_time': 36984, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223502, 'error': None, 'target': 'ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.271 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[23ca6f9d-419a-4f6a-b49a-47ad37ebd3fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.328 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[17f19b10-1132-4e36-88c4-aa4114bd34f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.329 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b1d5ca6-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.329 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.330 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b1d5ca6-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:22 np0005531887 kernel: tap7b1d5ca6-10: entered promiscuous mode
Nov 22 02:56:22 np0005531887 NetworkManager[55210]: <info>  [1763798182.3345] manager: (tap7b1d5ca6-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.334 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b1d5ca6-10, col_values=(('external_ids', {'iface-id': 'bc0b8c53-99a6-4485-9d82-052241c788cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:22Z|00185|binding|INFO|Releasing lport bc0b8c53-99a6-4485-9d82-052241c788cb from this chassis (sb_readonly=0)
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.336 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.348 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.349 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b1d5ca6-1256-44b6-ace0-8cd170d3014e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b1d5ca6-1256-44b6-ace0-8cd170d3014e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.350 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9443bdbc-4cb4-4d38-8c37-bb723514dd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.351 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-7b1d5ca6-1256-44b6-ace0-8cd170d3014e
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/7b1d5ca6-1256-44b6-ace0-8cd170d3014e.pid.haproxy
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 7b1d5ca6-1256-44b6-ace0-8cd170d3014e
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:56:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:22.351 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'env', 'PROCESS_TAG=haproxy-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b1d5ca6-1256-44b6-ace0-8cd170d3014e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.534 186853 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.535 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.535 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.535 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.536 186853 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.536 186853 WARNING nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state resized and task_state None.#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.536 186853 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.537 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.537 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.537 186853 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.537 186853 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.538 186853 WARNING nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state resized and task_state None.#033[00m
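The Acquiring/acquired/released triplets followed by "No waiting events found" and the "Received unexpected event" warning above trace a single pattern: an external `network-vif-plugged` event arrives, nova takes the per-instance `-events` lock, tries to pop a registered waiter for that event, and finds none (here because the instance is in `vm_state resized` with no task waiting). A minimal sketch of that lock-guarded pop, using a plain `threading.Lock` as a stand-in for oslo.concurrency's lock (the class shape is a hypothetical simplification of `nova.compute.manager.InstanceEvents`):

```python
import threading

class InstanceEventRegistry:
    """Simplified model of the pop_instance_event flow seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()
        # (instance_uuid, event_name) -> waiter registered by a task
        self._waiters = {}

    def register(self, instance_uuid, event_name, waiter):
        """A task (e.g. spawn) registers interest before the event fires."""
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter

    def pop_instance_event(self, instance_uuid, event_name):
        """External event handler: pop the waiter, or None if nobody is
        waiting -- the 'No waiting events found' / unexpected-event case."""
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)
```

When `pop_instance_event` returns a waiter, the event is dispatched to it (the "Processing event" path further down); when it returns nothing, nova only logs the warning, which is harmless here.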
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.556 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798182.556323, 8e21ac2e-9273-4909-9626-f29aae1d2c5a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.557 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] VM Started (Lifecycle Event)#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.581 186853 DEBUG nova.compute.manager [req-43267e6a-ba4a-40af-9d9d-c6f9ade90843 req-1eac5e39-2312-4910-ae8c-3e4d97bed5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.583 186853 DEBUG oslo_concurrency.lockutils [req-43267e6a-ba4a-40af-9d9d-c6f9ade90843 req-1eac5e39-2312-4910-ae8c-3e4d97bed5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.584 186853 DEBUG oslo_concurrency.lockutils [req-43267e6a-ba4a-40af-9d9d-c6f9ade90843 req-1eac5e39-2312-4910-ae8c-3e4d97bed5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.586 186853 DEBUG oslo_concurrency.lockutils [req-43267e6a-ba4a-40af-9d9d-c6f9ade90843 req-1eac5e39-2312-4910-ae8c-3e4d97bed5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.587 186853 DEBUG nova.compute.manager [req-43267e6a-ba4a-40af-9d9d-c6f9ade90843 req-1eac5e39-2312-4910-ae8c-3e4d97bed5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Processing event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.591 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.593 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.598 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.600 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.605 186853 INFO nova.virt.libvirt.driver [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Instance spawned successfully.#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.606 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.627 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.627 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798182.5573974, 8e21ac2e-9273-4909-9626-f29aae1d2c5a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.628 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.639 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.640 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.640 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.641 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.641 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.642 186853 DEBUG nova.virt.libvirt.driver [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.648 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.652 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798182.596359, 8e21ac2e-9273-4909-9626-f29aae1d2c5a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.652 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.673 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.679 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.695 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:22 np0005531887 podman[223540]: 2025-11-22 07:56:22.735384202 +0000 UTC m=+0.055993666 container create 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.768 186853 DEBUG nova.network.neutron [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Updated VIF entry in instance network info cache for port 5a3c5c20-b82a-439a-ad72-9d1caab258ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.770 186853 DEBUG nova.network.neutron [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Updating instance_info_cache with network_info: [{"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
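The cache-update line above embeds the instance's full `network_info` as JSON between the literal `network_info: ` marker and the trailing `update_instance_cache_with_nw_info` source reference, which makes it convenient to pull apart when debugging VIF state (port ID, MAC, fixed IP, `active` flag). A small extraction sketch for ad-hoc log spelunking under that layout assumption (not a general oslo.log parser):

```python
import json

def extract_network_info(line: str):
    """Parse the JSON network_info list out of a nova cache-update log line.

    Assumes the layout seen above: payload sits between 'network_info: '
    and the trailing ' update_instance_cache_with_nw_info' reference.
    Raises ValueError if the markers are absent.
    """
    start = line.index("network_info: ") + len("network_info: ")
    end = line.rindex(" update_instance_cache_with_nw_info")
    return json.loads(line[start:end])
```

On the line above this would surface, for example, that port `5a3c5c20-b82a-439a-ad72-9d1caab258ce` is bound by the `ovn` driver on `br-int` but still has `"active": false` at this point in the spawn.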
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.774 186853 INFO nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Took 6.13 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.774 186853 DEBUG nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:22 np0005531887 systemd[1]: Started libpod-conmon-85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40.scope.
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.783 186853 DEBUG oslo_concurrency.lockutils [req-50a16e2e-e062-41b8-a52a-76f03da532f0 req-f92a6b09-5278-4e3e-b255-1dc0f4964382 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8e21ac2e-9273-4909-9626-f29aae1d2c5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:22 np0005531887 podman[223540]: 2025-11-22 07:56:22.704849901 +0000 UTC m=+0.025459375 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:56:22 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:56:22 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d1dffd4b99df420c8bbcb141dbc6c65f3eca9c326f10650c5c1732c18bd750c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:56:22 np0005531887 podman[223540]: 2025-11-22 07:56:22.831159889 +0000 UTC m=+0.151769353 container init 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:56:22 np0005531887 podman[223540]: 2025-11-22 07:56:22.836465141 +0000 UTC m=+0.157074595 container start 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:56:22 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [NOTICE]   (223560) : New worker (223562) forked
Nov 22 02:56:22 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [NOTICE]   (223560) : Loading success.
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.868 186853 INFO nova.compute.manager [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Took 6.84 seconds to build instance.#033[00m
Nov 22 02:56:22 np0005531887 nova_compute[186849]: 2025-11-22 07:56:22.886 186853 DEBUG oslo_concurrency.lockutils [None req-68d98e02-1e83-473d-a952-f18737bb2d9c 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:23 np0005531887 podman[223571]: 2025-11-22 07:56:23.866432149 +0000 UTC m=+0.088982918 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:56:24 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 02:56:24 np0005531887 systemd[223248]: Activating special unit Exit the Session...
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped target Main User Target.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped target Basic System.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped target Paths.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped target Sockets.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped target Timers.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 02:56:24 np0005531887 systemd[223248]: Closed D-Bus User Message Bus Socket.
Nov 22 02:56:24 np0005531887 systemd[223248]: Stopped Create User's Volatile Files and Directories.
Nov 22 02:56:24 np0005531887 systemd[223248]: Removed slice User Application Slice.
Nov 22 02:56:24 np0005531887 systemd[223248]: Reached target Shutdown.
Nov 22 02:56:24 np0005531887 systemd[223248]: Finished Exit the Session.
Nov 22 02:56:24 np0005531887 systemd[223248]: Reached target Exit the Session.
Nov 22 02:56:24 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 02:56:24 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 02:56:24 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 02:56:24 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.804 186853 DEBUG nova.compute.manager [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:24 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.805 186853 DEBUG oslo_concurrency.lockutils [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.805 186853 DEBUG oslo_concurrency.lockutils [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:24 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.805 186853 DEBUG oslo_concurrency.lockutils [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.806 186853 DEBUG nova.compute.manager [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] No waiting events found dispatching network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:24 np0005531887 nova_compute[186849]: 2025-11-22 07:56:24.806 186853 WARNING nova.compute.manager [req-c6c56179-9112-465f-8f1e-963bfbacdf42 req-ee037c7b-eeb9-4d89-a66b-0a4f90f00ebe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received unexpected event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce for instance with vm_state active and task_state None.#033[00m
Nov 22 02:56:24 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 02:56:26 np0005531887 nova_compute[186849]: 2025-11-22 07:56:26.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:26 np0005531887 nova_compute[186849]: 2025-11-22 07:56:26.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:28 np0005531887 podman[223592]: 2025-11-22 07:56:28.846395639 +0000 UTC m=+0.060586250 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
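The podman `health_status` events above (multipathd, node_exporter) pack the container name and health verdict into a parenthesized `key=value` attribute list. A rough regex sketch for skimming those fields out of such lines (the field layout is taken from the events in this excerpt; podman's event format is not guaranteed stable across versions, so treat this as a debugging aid, not a parser for the format):

```python
import re

# \b avoids matching the 'name' inside 'container_name='
_NAME_RE = re.compile(r"\bname=([^,)]+)")
_HEALTH_RE = re.compile(r"\bhealth_status=([^,)]+)")

def parse_health(line: str):
    """Return (container_name, health_status) from a podman health event
    line, or None when either field is missing."""
    name = _NAME_RE.search(line)
    health = _HEALTH_RE.search(line)
    if not (name and health):
        return None
    return name.group(1), health.group(1)
```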
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.943 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.944 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.945 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.946 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.946 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.954 186853 INFO nova.compute.manager [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Terminating instance#033[00m
Nov 22 02:56:28 np0005531887 nova_compute[186849]: 2025-11-22 07:56:28.961 186853 DEBUG nova.compute.manager [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:56:28 np0005531887 kernel: tap5a3c5c20-b8 (unregistering): left promiscuous mode
Nov 22 02:56:29 np0005531887 NetworkManager[55210]: <info>  [1763798189.0000] device (tap5a3c5c20-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:56:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:29Z|00186|binding|INFO|Releasing lport 5a3c5c20-b82a-439a-ad72-9d1caab258ce from this chassis (sb_readonly=0)
Nov 22 02:56:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:29Z|00187|binding|INFO|Setting lport 5a3c5c20-b82a-439a-ad72-9d1caab258ce down in Southbound
Nov 22 02:56:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:29Z|00188|binding|INFO|Removing iface tap5a3c5c20-b8 ovn-installed in OVS
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.003 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.008 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:57:33 10.100.0.9'], port_security=['fa:16:3e:3e:57:33 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8e21ac2e-9273-4909-9626-f29aae1d2c5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73fcb22503a7438a99d8946a1ffb38ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5508c60b-1199-4229-9070-d4b6ffba0161', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43c6129c-fbb4-4ac7-933b-09f225eb7e77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=5a3c5c20-b82a-439a-ad72-9d1caab258ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.011 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 5a3c5c20-b82a-439a-ad72-9d1caab258ce in datapath 7b1d5ca6-1256-44b6-ace0-8cd170d3014e unbound from our chassis#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.012 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b1d5ca6-1256-44b6-ace0-8cd170d3014e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.013 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbe30e2-3d01-43b7-935b-b7aea47dd81e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.014 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e namespace which is not needed anymore#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.018 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Nov 22 02:56:29 np0005531887 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000004a.scope: Consumed 6.992s CPU time.
Nov 22 02:56:29 np0005531887 systemd-machined[153180]: Machine qemu-28-instance-0000004a terminated.
Nov 22 02:56:29 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [NOTICE]   (223560) : haproxy version is 2.8.14-c23fe91
Nov 22 02:56:29 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [NOTICE]   (223560) : path to executable is /usr/sbin/haproxy
Nov 22 02:56:29 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [WARNING]  (223560) : Exiting Master process...
Nov 22 02:56:29 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [ALERT]    (223560) : Current worker (223562) exited with code 143 (Terminated)
Nov 22 02:56:29 np0005531887 neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e[223556]: [WARNING]  (223560) : All workers exited. Exiting... (0)
Nov 22 02:56:29 np0005531887 systemd[1]: libpod-85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40.scope: Deactivated successfully.
Nov 22 02:56:29 np0005531887 podman[223641]: 2025-11-22 07:56:29.210733489 +0000 UTC m=+0.072521717 container died 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.227 186853 INFO nova.virt.libvirt.driver [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Instance destroyed successfully.#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.228 186853 DEBUG nova.objects.instance [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lazy-loading 'resources' on Instance uuid 8e21ac2e-9273-4909-9626-f29aae1d2c5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.238 186853 DEBUG nova.compute.manager [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-unplugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.239 186853 DEBUG oslo_concurrency.lockutils [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.239 186853 DEBUG oslo_concurrency.lockutils [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.239 186853 DEBUG oslo_concurrency.lockutils [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.240 186853 DEBUG nova.compute.manager [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] No waiting events found dispatching network-vif-unplugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.240 186853 DEBUG nova.compute.manager [req-fb981e22-b985-407c-8569-63f9ac970b62 req-a4a4f70f-3558-41d2-bba1-d5c532368bd0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-unplugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.245 186853 DEBUG nova.virt.libvirt.vif [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-667251429',display_name='tempest-ServerMetadataNegativeTestJSON-server-667251429',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-667251429',id=74,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='73fcb22503a7438a99d8946a1ffb38ad',ramdisk_id='',reservation_id='r-wtkezp7k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1154116890',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1154116890-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:56:22Z,user_data=None,user_id='468858baf68a4236b3d03dc430310165',uuid=8e21ac2e-9273-4909-9626-f29aae1d2c5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.246 186853 DEBUG nova.network.os_vif_util [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converting VIF {"id": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "address": "fa:16:3e:3e:57:33", "network": {"id": "7b1d5ca6-1256-44b6-ace0-8cd170d3014e", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2036743825-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73fcb22503a7438a99d8946a1ffb38ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3c5c20-b8", "ovs_interfaceid": "5a3c5c20-b82a-439a-ad72-9d1caab258ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.247 186853 DEBUG nova.network.os_vif_util [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.247 186853 DEBUG os_vif [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.249 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.249 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a3c5c20-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.250 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.255 186853 INFO os_vif [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:57:33,bridge_name='br-int',has_traffic_filtering=True,id=5a3c5c20-b82a-439a-ad72-9d1caab258ce,network=Network(7b1d5ca6-1256-44b6-ace0-8cd170d3014e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3c5c20-b8')#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.255 186853 INFO nova.virt.libvirt.driver [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Deleting instance files /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a_del#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.256 186853 INFO nova.virt.libvirt.driver [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Deletion of /var/lib/nova/instances/8e21ac2e-9273-4909-9626-f29aae1d2c5a_del complete#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.339 186853 INFO nova.compute.manager [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.340 186853 DEBUG oslo.service.loopingcall [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.340 186853 DEBUG nova.compute.manager [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.341 186853 DEBUG nova.network.neutron [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:56:29 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40-userdata-shm.mount: Deactivated successfully.
Nov 22 02:56:29 np0005531887 systemd[1]: var-lib-containers-storage-overlay-4d1dffd4b99df420c8bbcb141dbc6c65f3eca9c326f10650c5c1732c18bd750c-merged.mount: Deactivated successfully.
Nov 22 02:56:29 np0005531887 podman[223641]: 2025-11-22 07:56:29.37042536 +0000 UTC m=+0.232213588 container cleanup 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 02:56:29 np0005531887 systemd[1]: libpod-conmon-85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40.scope: Deactivated successfully.
Nov 22 02:56:29 np0005531887 podman[223687]: 2025-11-22 07:56:29.510224183 +0000 UTC m=+0.112474304 container remove 85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.517 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c350d1-ac25-47ef-8cb8-d8f1db1beb63]: (4, ('Sat Nov 22 07:56:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e (85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40)\n85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40\nSat Nov 22 07:56:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e (85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40)\n85d924c6c2edb0165dd941c3127c4dfd231c48a9a29fe88e51b5f45d9de7cf40\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.519 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2a72aa20-d1d2-446a-a03c-f2a08c4bf662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.520 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b1d5ca6-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 kernel: tap7b1d5ca6-10: left promiscuous mode
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.527 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.532 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[315f93fd-0282-40e0-8e51-4012e8376c83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 nova_compute[186849]: 2025-11-22 07:56:29.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.563 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[539e7fe3-4f86-476d-8b97-fc6daf75666c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.564 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[919bd1ab-3398-41ad-866d-5550487deb85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.587 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d1994fbf-e5b2-4e24-8a2c-13ab97277fda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492471, 'reachable_time': 19966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223702, 'error': None, 'target': 'ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:29 np0005531887 systemd[1]: run-netns-ovnmeta\x2d7b1d5ca6\x2d1256\x2d44b6\x2dace0\x2d8cd170d3014e.mount: Deactivated successfully.
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.590 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b1d5ca6-1256-44b6-ace0-8cd170d3014e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:56:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:29.590 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0670c390-6137-441e-9af3-819fdcc1a965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.717 186853 DEBUG nova.network.neutron [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.750 186853 DEBUG nova.compute.manager [req-a0afe4bf-d386-4649-9676-b76a73f4349a req-f1751638-ffa3-460d-bfa5-e9e78c0f4549 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-deleted-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.751 186853 INFO nova.compute.manager [req-a0afe4bf-d386-4649-9676-b76a73f4349a req-f1751638-ffa3-460d-bfa5-e9e78c0f4549 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Neutron deleted interface 5a3c5c20-b82a-439a-ad72-9d1caab258ce; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.751 186853 DEBUG nova.network.neutron [req-a0afe4bf-d386-4649-9676-b76a73f4349a req-f1751638-ffa3-460d-bfa5-e9e78c0f4549 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.769 186853 INFO nova.compute.manager [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Took 1.43 seconds to deallocate network for instance.#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.778 186853 DEBUG nova.compute.manager [req-a0afe4bf-d386-4649-9676-b76a73f4349a req-f1751638-ffa3-460d-bfa5-e9e78c0f4549 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Detach interface failed, port_id=5a3c5c20-b82a-439a-ad72-9d1caab258ce, reason: Instance 8e21ac2e-9273-4909-9626-f29aae1d2c5a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.839 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.840 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.942 186853 DEBUG nova.compute.provider_tree [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.957 186853 DEBUG nova.scheduler.client.report [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:30 np0005531887 nova_compute[186849]: 2025-11-22 07:56:30.981 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.063 186853 INFO nova.scheduler.client.report [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Deleted allocations for instance 8e21ac2e-9273-4909-9626-f29aae1d2c5a#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.209 186853 DEBUG oslo_concurrency.lockutils [None req-dc377e59-c67d-4439-b45c-6364068c6801 468858baf68a4236b3d03dc430310165 73fcb22503a7438a99d8946a1ffb38ad - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.338 186853 DEBUG nova.compute.manager [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.339 186853 DEBUG oslo_concurrency.lockutils [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.339 186853 DEBUG oslo_concurrency.lockutils [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.339 186853 DEBUG oslo_concurrency.lockutils [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8e21ac2e-9273-4909-9626-f29aae1d2c5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.339 186853 DEBUG nova.compute.manager [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] No waiting events found dispatching network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.340 186853 WARNING nova.compute.manager [req-d55464b0-e473-4e07-b83c-77323c34bf7f req-de28f1bc-2392-49c6-ab97-4d3d26c0e73c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Received unexpected event network-vif-plugged-5a3c5c20-b82a-439a-ad72-9d1caab258ce for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:56:31 np0005531887 nova_compute[186849]: 2025-11-22 07:56:31.522 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.167 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.168 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.168 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.169 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.169 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.177 186853 INFO nova.compute.manager [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Terminating instance#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.186 186853 DEBUG nova.compute.manager [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:56:32 np0005531887 kernel: tap041d3fd2-9a (unregistering): left promiscuous mode
Nov 22 02:56:32 np0005531887 NetworkManager[55210]: <info>  [1763798192.2070] device (tap041d3fd2-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:56:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:32Z|00189|binding|INFO|Releasing lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b from this chassis (sb_readonly=0)
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:32Z|00190|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b down in Southbound
Nov 22 02:56:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:32Z|00191|binding|INFO|Removing iface tap041d3fd2-9a ovn-installed in OVS
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.219 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.235 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.243 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:97:43 10.100.0.14'], port_security=['fa:16:3e:31:97:43 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5eafb037-41a2-463f-9d3a-1b4248cb00f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=041d3fd2-9a77-48a1-b976-9dda05f01f7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.245 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 041d3fd2-9a77-48a1-b976-9dda05f01f7b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.246 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.247 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9db80c-d142-4219-aaca-8f38f4b1b621]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.247 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore#033[00m
Nov 22 02:56:32 np0005531887 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000046.scope: Deactivated successfully.
Nov 22 02:56:32 np0005531887 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000046.scope: Consumed 12.245s CPU time.
Nov 22 02:56:32 np0005531887 systemd-machined[153180]: Machine qemu-27-instance-00000046 terminated.
Nov 22 02:56:32 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [NOTICE]   (223452) : haproxy version is 2.8.14-c23fe91
Nov 22 02:56:32 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [NOTICE]   (223452) : path to executable is /usr/sbin/haproxy
Nov 22 02:56:32 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [WARNING]  (223452) : Exiting Master process...
Nov 22 02:56:32 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [ALERT]    (223452) : Current worker (223454) exited with code 143 (Terminated)
Nov 22 02:56:32 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223448]: [WARNING]  (223452) : All workers exited. Exiting... (0)
Nov 22 02:56:32 np0005531887 systemd[1]: libpod-e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42.scope: Deactivated successfully.
Nov 22 02:56:32 np0005531887 podman[223728]: 2025-11-22 07:56:32.389489691 +0000 UTC m=+0.050336556 container died e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:56:32 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42-userdata-shm.mount: Deactivated successfully.
Nov 22 02:56:32 np0005531887 systemd[1]: var-lib-containers-storage-overlay-5a1fe8b600a9bfcc1476e28b6cc24df9875dca020f1fb5c431cfa6c42ced55d5-merged.mount: Deactivated successfully.
Nov 22 02:56:32 np0005531887 podman[223728]: 2025-11-22 07:56:32.435708252 +0000 UTC m=+0.096555117 container cleanup e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:32 np0005531887 systemd[1]: libpod-conmon-e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42.scope: Deactivated successfully.
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.456 186853 INFO nova.virt.libvirt.driver [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance destroyed successfully.#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.457 186853 DEBUG nova.objects.instance [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.474 186853 DEBUG nova.virt.libvirt.vif [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:56:24Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.475 186853 DEBUG nova.network.os_vif_util [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.476 186853 DEBUG nova.network.os_vif_util [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.476 186853 DEBUG os_vif [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.477 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.478 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041d3fd2-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.483 186853 INFO os_vif [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a')#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.483 186853 INFO nova.virt.libvirt.driver [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Deleting instance files /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_del#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.490 186853 INFO nova.virt.libvirt.driver [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Deletion of /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_del complete#033[00m
Nov 22 02:56:32 np0005531887 podman[223773]: 2025-11-22 07:56:32.522172418 +0000 UTC m=+0.057166226 container remove e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.528 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f43f1a-69ed-40f9-9457-fda336be48e0]: (4, ('Sat Nov 22 07:56:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42)\ne8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42\nSat Nov 22 07:56:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (e8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42)\ne8b24e209502207a55f39799f013840f87986758e8dfef2510be31a61dc11f42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.530 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc65ab2-6b47-4326-ad74-3caa0ff53632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.532 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.534 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.548 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.550 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5fce751e-477f-4e52-ad45-1757360890aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.561 186853 DEBUG nova.compute.manager [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.562 186853 DEBUG oslo_concurrency.lockutils [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.562 186853 DEBUG oslo_concurrency.lockutils [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.564 186853 DEBUG oslo_concurrency.lockutils [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.565 186853 DEBUG nova.compute.manager [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.565 186853 DEBUG nova.compute.manager [req-d5ae9c68-15ed-43a5-8d7f-5505ed633d98 req-bbf631be-8b56-45c2-8dda-b1fe291f4e06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.570 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9f91cebd-b94b-4260-a12b-854af3ac9a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.572 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff1ee0c-f79e-47ef-b408-c072ff8ed2be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.579 186853 INFO nova.compute.manager [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.580 186853 DEBUG oslo.service.loopingcall [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.580 186853 DEBUG nova.compute.manager [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:56:32 np0005531887 nova_compute[186849]: 2025-11-22 07:56:32.580 186853 DEBUG nova.network.neutron [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.593 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e258ada7-c8ae-4239-8c40-86389d5b15bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492279, 'reachable_time': 23062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223786, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:32 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.597 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:56:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:32.597 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9fa232-500c-484a-9b20-a802f5fc2e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.676 186853 DEBUG nova.compute.manager [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.677 186853 DEBUG oslo_concurrency.lockutils [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.677 186853 DEBUG oslo_concurrency.lockutils [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.677 186853 DEBUG oslo_concurrency.lockutils [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.678 186853 DEBUG nova.compute.manager [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:34 np0005531887 nova_compute[186849]: 2025-11-22 07:56:34.678 186853 WARNING nova.compute.manager [req-9643b71f-062e-4409-bcab-32aeb7408e65 req-9b40f90e-deb8-4a23-b552-795548d1a3b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state deleting.#033[00m
Nov 22 02:56:34 np0005531887 podman[223787]: 2025-11-22 07:56:34.846648078 +0000 UTC m=+0.060168201 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.307 186853 DEBUG nova.network.neutron [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.322 186853 INFO nova.compute.manager [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Took 2.74 seconds to deallocate network for instance.#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.438 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.439 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.446 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.504 186853 INFO nova.scheduler.client.report [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocations for instance 5eafb037-41a2-463f-9d3a-1b4248cb00f2#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.510 186853 DEBUG nova.compute.manager [req-a5abb145-16ad-44e0-8d64-f019a14160df req-7733b5d5-499a-4a5c-801f-e1a329159e1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-deleted-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:35 np0005531887 nova_compute[186849]: 2025-11-22 07:56:35.591 186853 DEBUG oslo_concurrency.lockutils [None req-f5f81692-dbf3-48a8-91ca-3db0878651b7 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:36 np0005531887 nova_compute[186849]: 2025-11-22 07:56:36.524 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:56:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 02:56:36 np0005531887 nova_compute[186849]: 2025-11-22 07:56:36.872 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:36 np0005531887 nova_compute[186849]: 2025-11-22 07:56:36.873 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:36 np0005531887 nova_compute[186849]: 2025-11-22 07:56:36.907 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.000 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.001 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.006 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.006 186853 INFO nova.compute.claims [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.117 186853 DEBUG nova.compute.provider_tree [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.142 186853 DEBUG nova.scheduler.client.report [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.162 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.163 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.243 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.244 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.270 186853 INFO nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.289 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:37.327 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:37.327 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:37.327 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.454 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.455 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.456 186853 INFO nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Creating image(s)#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.456 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.457 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.458 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.471 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.488 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.529 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.530 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.531 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.544 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.610 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.611 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.674 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.675 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.676 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.729 186853 DEBUG nova.policy [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063bf16c91af408ca075c690797e09d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.742 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.743 186853 DEBUG nova.virt.disk.api [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.744 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.811 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.812 186853 DEBUG nova.virt.disk.api [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.813 186853 DEBUG nova.objects.instance [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.830 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.831 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Ensure instance console log exists: /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.831 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.832 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:37 np0005531887 nova_compute[186849]: 2025-11-22 07:56:37.832 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:38 np0005531887 nova_compute[186849]: 2025-11-22 07:56:38.045 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:38 np0005531887 podman[223824]: 2025-11-22 07:56:38.851473275 +0000 UTC m=+0.069445382 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:56:38 np0005531887 podman[223825]: 2025-11-22 07:56:38.882452727 +0000 UTC m=+0.094629459 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 02:56:39 np0005531887 nova_compute[186849]: 2025-11-22 07:56:39.351 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Successfully created port: 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.618 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Successfully updated port: 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.633 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.633 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.633 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.815 186853 DEBUG nova.compute.manager [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.815 186853 DEBUG nova.compute.manager [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing instance network info cache due to event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.816 186853 DEBUG oslo_concurrency.lockutils [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:40 np0005531887 nova_compute[186849]: 2025-11-22 07:56:40.960 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:56:41 np0005531887 nova_compute[186849]: 2025-11-22 07:56:41.526 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:41 np0005531887 nova_compute[186849]: 2025-11-22 07:56:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.492 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.499 186853 DEBUG nova.network.neutron [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.545 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.546 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance network_info: |[{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.546 186853 DEBUG oslo_concurrency.lockutils [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.547 186853 DEBUG nova.network.neutron [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.551 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start _get_guest_xml network_info=[{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.559 186853 WARNING nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.565 186853 DEBUG nova.virt.libvirt.host [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.566 186853 DEBUG nova.virt.libvirt.host [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.577 186853 DEBUG nova.virt.libvirt.host [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.578 186853 DEBUG nova.virt.libvirt.host [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.579 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.579 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.580 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.580 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.580 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.580 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.580 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.581 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.581 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.581 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.581 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.581 186853 DEBUG nova.virt.hardware [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.585 186853 DEBUG nova.virt.libvirt.vif [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskC
onfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:37Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.585 186853 DEBUG nova.network.os_vif_util [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.586 186853 DEBUG nova.network.os_vif_util [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.586 186853 DEBUG nova.objects.instance [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.600 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <uuid>c6cd5fec-f214-4bbc-b854-9e16c9a7577a</uuid>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <name>instance-0000004c</name>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-772508279</nova:name>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:56:42</nova:creationTime>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        <nova:port uuid="3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="serial">c6cd5fec-f214-4bbc-b854-9e16c9a7577a</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="uuid">c6cd5fec-f214-4bbc-b854-9e16c9a7577a</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:27:44:b9"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <target dev="tap3640f80e-71"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/console.log" append="off"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:56:42 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:56:42 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:56:42 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:56:42 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.602 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Preparing to wait for external event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.602 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.603 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.603 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.604 186853 DEBUG nova.virt.libvirt.vif [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-S
erverDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:37Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.604 186853 DEBUG nova.network.os_vif_util [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.605 186853 DEBUG nova.network.os_vif_util [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.605 186853 DEBUG os_vif [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.606 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.607 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.607 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.610 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.611 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3640f80e-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.611 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3640f80e-71, col_values=(('external_ids', {'iface-id': '3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:44:b9', 'vm-uuid': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531887 NetworkManager[55210]: <info>  [1763798202.6144] manager: (tap3640f80e-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.615 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.622 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.624 186853 INFO os_vif [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71')#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.905 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.906 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.907 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:27:44:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:56:42 np0005531887 nova_compute[186849]: 2025-11-22 07:56:42.907 186853 INFO nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Using config drive#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.226 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798189.2255511, 8e21ac2e-9273-4909-9626-f29aae1d2c5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.227 186853 INFO nova.compute.manager [-] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.256 186853 DEBUG nova.compute.manager [None req-afc27b4b-3013-431b-b6dd-cd1ac0404fb6 - - - - - -] [instance: 8e21ac2e-9273-4909-9626-f29aae1d2c5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.270 186853 INFO nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Creating config drive at /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.276 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_spl_x61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.415 186853 DEBUG oslo_concurrency.processutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_spl_x61" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:44 np0005531887 kernel: tap3640f80e-71: entered promiscuous mode
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.5100] manager: (tap3640f80e-71): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:44Z|00192|binding|INFO|Claiming lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for this chassis.
Nov 22 02:56:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:44Z|00193|binding|INFO|3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2: Claiming fa:16:3e:27:44:b9 10.100.0.12
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.540 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:44:b9 10.100.0.12'], port_security=['fa:16:3e:27:44:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.542 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.543 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f#033[00m
Nov 22 02:56:44 np0005531887 systemd-udevd[223898]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.558 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4acf4c8c-8172-4a39-95ac-8db6e4d7d973]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.559 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.5604] device (tap3640f80e-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.5617] device (tap3640f80e-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:56:44 np0005531887 systemd-machined[153180]: New machine qemu-29-instance-0000004c.
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.571 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.572 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[89481673-31b1-4733-8747-61c833aaa615]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.574 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9e335f1c-e43e-4089-8b15-a62f1aa30e8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 podman[223880]: 2025-11-22 07:56:44.582614275 +0000 UTC m=+0.080214550 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.582 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.587 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.587 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[829938cf-d637-421e-8d34-1e6f8b371396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:44Z|00194|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 ovn-installed in OVS
Nov 22 02:56:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:44Z|00195|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 up in Southbound
Nov 22 02:56:44 np0005531887 systemd[1]: Started Virtual Machine qemu-29-instance-0000004c.
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.591 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.619 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[72c2fe49-bf9a-4986-bd98-7f2c6f8efa34]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.657 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[04333c60-069d-4dc4-a2e0-30a70031e97d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.666 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95e66155-d08a-4e0d-b207-d013f11c15bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 systemd-udevd[223906]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.6692] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.711 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[445c5869-c134-488e-b7b5-f5e026a49423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.714 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2da79ff6-1c2b-4fd6-beac-0835da376f46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.7477] device (tapd54e232a-50): carrier: link connected
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.754 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[09196b26-f7bb-4333-a270-b189d98d2b45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.776 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[38aae3a3-f172-49b0-9258-b7eaa501d984]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494735, 'reachable_time': 41795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223943, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.798 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[00d351c9-a692-4700-8e1f-3b290f659dce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494735, 'tstamp': 494735}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223944, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.817 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6c3496-391d-4a32-9dfa-870e69e7ddf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494735, 'reachable_time': 41795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223945, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.853 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca04a7d-2b71-423e-a11c-c1779a7c240d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.929 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[28fda64a-d481-4cad-9e8c-2eeff4e1f3f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.931 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.931 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.932 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:44 np0005531887 NetworkManager[55210]: <info>  [1763798204.9350] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.934 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.939 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.940 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:44Z|00196|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.942 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.944 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c1abc64b-0f8a-4bb2-8823-d09adbdc5fb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.945 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:56:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:44.946 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:56:44 np0005531887 nova_compute[186849]: 2025-11-22 07:56:44.954 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.027 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798205.0266223, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.027 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Started (Lifecycle Event)#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.052 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.057 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798205.0269692, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.057 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.073 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.078 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.094 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.214 186853 DEBUG nova.compute.manager [req-211bd6e3-6995-48a6-8628-7a43089db427 req-cfc0faab-e294-442f-b1e9-8278a8a4adbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.215 186853 DEBUG oslo_concurrency.lockutils [req-211bd6e3-6995-48a6-8628-7a43089db427 req-cfc0faab-e294-442f-b1e9-8278a8a4adbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.215 186853 DEBUG oslo_concurrency.lockutils [req-211bd6e3-6995-48a6-8628-7a43089db427 req-cfc0faab-e294-442f-b1e9-8278a8a4adbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.215 186853 DEBUG oslo_concurrency.lockutils [req-211bd6e3-6995-48a6-8628-7a43089db427 req-cfc0faab-e294-442f-b1e9-8278a8a4adbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.216 186853 DEBUG nova.compute.manager [req-211bd6e3-6995-48a6-8628-7a43089db427 req-cfc0faab-e294-442f-b1e9-8278a8a4adbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Processing event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.217 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.221 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798205.221531, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.222 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.224 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.230 186853 INFO nova.virt.libvirt.driver [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance spawned successfully.#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.234 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.251 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.259 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.265 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.265 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.266 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.267 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.268 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.268 186853 DEBUG nova.virt.libvirt.driver [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.276 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:56:45 np0005531887 podman[223983]: 2025-11-22 07:56:45.350421661 +0000 UTC m=+0.066419907 container create 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.359 186853 INFO nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Took 7.90 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.360 186853 DEBUG nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:45 np0005531887 systemd[1]: Started libpod-conmon-125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718.scope.
Nov 22 02:56:45 np0005531887 podman[223983]: 2025-11-22 07:56:45.31384072 +0000 UTC m=+0.029838986 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.430 186853 DEBUG nova.network.neutron [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updated VIF entry in instance network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.431 186853 DEBUG nova.network.neutron [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:45 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:56:45 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c9ddeb688ef326eca70afc8594407fb268e5e699e9c27fcaeedb9b6040e17c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:56:45 np0005531887 podman[223983]: 2025-11-22 07:56:45.459810137 +0000 UTC m=+0.175808413 container init 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.467 186853 DEBUG oslo_concurrency.lockutils [req-5f810c27-5d99-4ace-8eed-f55b7b636d30 req-164f98a3-6a4d-4fc8-82f2-b307ccfc1a65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:45 np0005531887 podman[223983]: 2025-11-22 07:56:45.465758775 +0000 UTC m=+0.181757021 container start 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:56:45 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [NOTICE]   (224002) : New worker (224004) forked
Nov 22 02:56:45 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [NOTICE]   (224002) : Loading success.
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.494 186853 INFO nova.compute.manager [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Took 8.52 seconds to build instance.#033[00m
Nov 22 02:56:45 np0005531887 nova_compute[186849]: 2025-11-22 07:56:45.540 186853 DEBUG oslo_concurrency.lockutils [None req-640c3d1a-ffe6-4a44-93f1-4b2b10dc60ea e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.875 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.936 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.938 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:46 np0005531887 nova_compute[186849]: 2025-11-22 07:56:46.993 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.197 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.199 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5567MB free_disk=73.41507720947266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.199 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.200 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.262 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c6cd5fec-f214-4bbc-b854-9e16c9a7577a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.263 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.263 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.308 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.321 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.348 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.349 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.375 186853 DEBUG nova.compute.manager [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.376 186853 DEBUG oslo_concurrency.lockutils [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.377 186853 DEBUG oslo_concurrency.lockutils [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.377 186853 DEBUG oslo_concurrency.lockutils [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.378 186853 DEBUG nova.compute.manager [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.379 186853 WARNING nova.compute.manager [req-6af97b11-f2e0-4964-8846-29236480dc70 req-ede154d4-d0e6-4504-a9dc-1b9872bb18c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state None.#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.453 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798192.4530342, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.454 186853 INFO nova.compute.manager [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.476 186853 DEBUG nova.compute.manager [None req-c273c325-016c-4b1a-9b5e-9910eb40cbde - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:56:47 np0005531887 nova_compute[186849]: 2025-11-22 07:56:47.614 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:48 np0005531887 nova_compute[186849]: 2025-11-22 07:56:48.349 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:48 np0005531887 nova_compute[186849]: 2025-11-22 07:56:48.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:48 np0005531887 nova_compute[186849]: 2025-11-22 07:56:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:48 np0005531887 nova_compute[186849]: 2025-11-22 07:56:48.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:56:49 np0005531887 nova_compute[186849]: 2025-11-22 07:56:49.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.798 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:56:50 np0005531887 nova_compute[186849]: 2025-11-22 07:56:50.799 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:50 np0005531887 podman[224020]: 2025-11-22 07:56:50.839209812 +0000 UTC m=+0.059406342 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 02:56:51 np0005531887 nova_compute[186849]: 2025-11-22 07:56:51.215 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:56:51 np0005531887 nova_compute[186849]: 2025-11-22 07:56:51.532 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.518 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.545 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.545 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.546 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.546 186853 DEBUG nova.network.neutron [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.547 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:56:52 np0005531887 nova_compute[186849]: 2025-11-22 07:56:52.616 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:54 np0005531887 podman[224039]: 2025-11-22 07:56:54.863684329 +0000 UTC m=+0.068260192 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.086 186853 DEBUG nova.network.neutron [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.100 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.290 186853 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.291 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Creating file /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/816b62b5f6ed42cdb7dc947f5cf2f744.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.291 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/816b62b5f6ed42cdb7dc947f5cf2f744.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.722 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/816b62b5f6ed42cdb7dc947f5cf2f744.tmp" returned: 1 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.723 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/816b62b5f6ed42cdb7dc947f5cf2f744.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.724 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Creating directory /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.724 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.933 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:55 np0005531887 nova_compute[186849]: 2025-11-22 07:56:55.938 186853 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.425 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.427 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.475 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.600 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.600 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.606 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.607 186853 INFO nova.compute.claims [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.746 186853 DEBUG nova.compute.provider_tree [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.758 186853 DEBUG nova.scheduler.client.report [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.784 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.785 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.846 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.847 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.863 186853 INFO nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:56:56 np0005531887 nova_compute[186849]: 2025-11-22 07:56:56.902 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.042 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.044 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.045 186853 INFO nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Creating image(s)#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.045 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.046 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.047 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.064 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.134 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.136 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.137 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.149 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.213 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.214 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.255 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.257 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.257 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.327 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.329 186853 DEBUG nova.virt.disk.api [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Checking if we can resize image /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.329 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.397 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.398 186853 DEBUG nova.virt.disk.api [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Cannot resize image /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
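The two nova.virt.disk.api messages above come from the resize check in nova/virt/disk/api.py (can_resize_image): after qemu-img info reports the overlay's virtual size, Nova only resizes when the requested size is strictly larger. A minimal sketch of that comparison, simplified from the behavior the log shows (the function name and grow-only rule match the log; the exact upstream implementation may differ):

```python
def can_resize_image(virtual_size: int, requested_size: int) -> bool:
    """Grow-only check behind the 'Cannot resize image ... to a smaller
    size' debug message: shrinking (or a no-op same-size request) is
    refused, growing is allowed."""
    return requested_size > virtual_size

# In the log, the m1.nano flavor's root disk is 1 GiB and the qcow2
# overlay was created at exactly 1073741824 bytes, so no resize happens.
assert can_resize_image(1073741824, 1073741824) is False  # same size: skip
assert can_resize_image(21430272, 1073741824) is True     # grow: allowed
```

This explains why spawning proceeds directly to "Created local disks" with no qemu-img resize call in between.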
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.398 186853 DEBUG nova.objects.instance [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.427 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.428 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Ensure instance console log exists: /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.428 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.429 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.429 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.487 186853 DEBUG nova.policy [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:56:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:57.614 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:56:57.616 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:56:57 np0005531887 nova_compute[186849]: 2025-11-22 07:56:57.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:56:59 np0005531887 nova_compute[186849]: 2025-11-22 07:56:59.166 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Successfully created port: 392e43af-a923-4bd6-bdff-445c6101995b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:56:59 np0005531887 podman[224093]: 2025-11-22 07:56:59.838205763 +0000 UTC m=+0.057309439 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:56:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:59Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:44:b9 10.100.0.12
Nov 22 02:56:59 np0005531887 ovn_controller[95130]: 2025-11-22T07:56:59Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:44:b9 10.100.0.12
Nov 22 02:57:01 np0005531887 nova_compute[186849]: 2025-11-22 07:57:01.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:02 np0005531887 nova_compute[186849]: 2025-11-22 07:57:02.621 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:03.618 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:05 np0005531887 podman[224119]: 2025-11-22 07:57:05.862632715 +0000 UTC m=+0.068743895 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 22 02:57:05 np0005531887 nova_compute[186849]: 2025-11-22 07:57:05.991 186853 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.010 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Successfully updated port: 392e43af-a923-4bd6-bdff-445c6101995b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.038 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.038 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.038 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.179 186853 DEBUG nova.compute.manager [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-changed-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.180 186853 DEBUG nova.compute.manager [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Refreshing instance network info cache due to event network-changed-392e43af-a923-4bd6-bdff-445c6101995b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.180 186853 DEBUG oslo_concurrency.lockutils [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.237 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:57:06 np0005531887 nova_compute[186849]: 2025-11-22 07:57:06.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.659 186853 DEBUG nova.network.neutron [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.689 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.689 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance network_info: |[{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.690 186853 DEBUG oslo_concurrency.lockutils [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.690 186853 DEBUG nova.network.neutron [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Refreshing network info cache for port 392e43af-a923-4bd6-bdff-445c6101995b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.693 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start _get_guest_xml network_info=[{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.698 186853 WARNING nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.711 186853 DEBUG nova.virt.libvirt.host [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.711 186853 DEBUG nova.virt.libvirt.host [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.719 186853 DEBUG nova.virt.libvirt.host [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.720 186853 DEBUG nova.virt.libvirt.host [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.721 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.722 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.722 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.723 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.723 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.723 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.723 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.724 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.724 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.724 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.724 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.725 186853 DEBUG nova.virt.hardware [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.728 186853 DEBUG nova.virt.libvirt.vif [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-697404939',display_name='tempest-ServerStableDeviceRescueTest-server-697404939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-697404939',id=77,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-0m39mt0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:56Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=e4a6074c-55b0-4529-b184-3ba3ca0dab8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.729 186853 DEBUG nova.network.os_vif_util [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.729 186853 DEBUG nova.network.os_vif_util [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.730 186853 DEBUG nova.objects.instance [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.753 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <uuid>e4a6074c-55b0-4529-b184-3ba3ca0dab8c</uuid>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <name>instance-0000004d</name>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-697404939</nova:name>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:57:07</nova:creationTime>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        <nova:port uuid="392e43af-a923-4bd6-bdff-445c6101995b">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="serial">e4a6074c-55b0-4529-b184-3ba3ca0dab8c</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="uuid">e4a6074c-55b0-4529-b184-3ba3ca0dab8c</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:10:9b:64"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <target dev="tap392e43af-a9"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/console.log" append="off"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:57:07 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:57:07 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:57:07 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:57:07 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.755 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Preparing to wait for external event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.755 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.756 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.756 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.757 186853 DEBUG nova.virt.libvirt.vif [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:56:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-697404939',display_name='tempest-ServerStableDeviceRescueTest-server-697404939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-697404939',id=77,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-0m39mt0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:56:56Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=e4a6074c-55b0-4529-b184-3ba3ca0dab8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.757 186853 DEBUG nova.network.os_vif_util [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.758 186853 DEBUG nova.network.os_vif_util [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.758 186853 DEBUG os_vif [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.759 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.760 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.763 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.763 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap392e43af-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.763 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap392e43af-a9, col_values=(('external_ids', {'iface-id': '392e43af-a923-4bd6-bdff-445c6101995b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:9b:64', 'vm-uuid': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.765 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 NetworkManager[55210]: <info>  [1763798227.7670] manager: (tap392e43af-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.767 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.773 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.775 186853 INFO os_vif [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9')#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.829 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.829 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.830 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:10:9b:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:57:07 np0005531887 nova_compute[186849]: 2025-11-22 07:57:07.830 186853 INFO nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Using config drive#033[00m
Nov 22 02:57:08 np0005531887 kernel: tap3640f80e-71 (unregistering): left promiscuous mode
Nov 22 02:57:08 np0005531887 NetworkManager[55210]: <info>  [1763798228.1894] device (tap3640f80e-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:08 np0005531887 nova_compute[186849]: 2025-11-22 07:57:08.196 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:08 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:08Z|00197|binding|INFO|Releasing lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 from this chassis (sb_readonly=0)
Nov 22 02:57:08 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:08Z|00198|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 down in Southbound
Nov 22 02:57:08 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:08Z|00199|binding|INFO|Removing iface tap3640f80e-71 ovn-installed in OVS
Nov 22 02:57:08 np0005531887 nova_compute[186849]: 2025-11-22 07:57:08.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:08 np0005531887 nova_compute[186849]: 2025-11-22 07:57:08.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:08 np0005531887 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Nov 22 02:57:08 np0005531887 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000004c.scope: Consumed 14.620s CPU time.
Nov 22 02:57:08 np0005531887 systemd-machined[153180]: Machine qemu-29-instance-0000004c terminated.
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.007 186853 INFO nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.015 186853 INFO nova.virt.libvirt.driver [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance destroyed successfully.#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.016 186853 DEBUG nova.virt.libvirt.vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:56:50Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.017 186853 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.018 186853 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.018 186853 DEBUG os_vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.019 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.020 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3640f80e-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.021 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.023 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.028 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.030 186853 INFO os_vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71')#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.033 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.054 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:44:b9 10.100.0.12'], port_security=['fa:16:3e:27:44:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.055 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.057 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.058 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4a9204-c3a4-4774-a865-bc37cba3e00d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.058 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.108 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.109 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.179 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.181 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk to 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.182 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [NOTICE]   (224002) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [NOTICE]   (224002) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [WARNING]  (224002) : Exiting Master process...
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [WARNING]  (224002) : Exiting Master process...
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [ALERT]    (224002) : Current worker (224004) exited with code 143 (Terminated)
Nov 22 02:57:09 np0005531887 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223998]: [WARNING]  (224002) : All workers exited. Exiting... (0)
Nov 22 02:57:09 np0005531887 systemd[1]: libpod-125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718.scope: Deactivated successfully.
Nov 22 02:57:09 np0005531887 podman[224192]: 2025-11-22 07:57:09.2462268 +0000 UTC m=+0.088094357 container died 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:09 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718-userdata-shm.mount: Deactivated successfully.
Nov 22 02:57:09 np0005531887 systemd[1]: var-lib-containers-storage-overlay-b8c9ddeb688ef326eca70afc8594407fb268e5e699e9c27fcaeedb9b6040e17c-merged.mount: Deactivated successfully.
Nov 22 02:57:09 np0005531887 podman[224192]: 2025-11-22 07:57:09.309567289 +0000 UTC m=+0.151434846 container cleanup 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 02:57:09 np0005531887 systemd[1]: libpod-conmon-125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718.scope: Deactivated successfully.
Nov 22 02:57:09 np0005531887 podman[224210]: 2025-11-22 07:57:09.343449103 +0000 UTC m=+0.071564124 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.367 186853 INFO nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Creating config drive at /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.373 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0tmikh8r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:09 np0005531887 podman[224216]: 2025-11-22 07:57:09.381282366 +0000 UTC m=+0.107276915 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:57:09 np0005531887 podman[224253]: 2025-11-22 07:57:09.41957222 +0000 UTC m=+0.087418459 container remove 125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.425 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[92d9e55b-c136-42f6-aba6-5a5bc6e0849e]: (4, ('Sat Nov 22 07:57:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718)\n125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718\nSat Nov 22 07:57:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718)\n125c2020204afba0ea9a79b5d6680116b68f4c0b40b6fb0cca61e391d37e4718\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.427 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[19666ff2-f602-4b2d-849d-c4b8fdc0e4bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.428 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:09 np0005531887 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.432 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.447 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.451 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9437ae42-b16a-4685-bbf1-1547e6673d13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.466 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[005c6556-d644-4b1d-b1c7-a1fa1c826354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.467 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a8206d8f-d907-44e9-b421-0f263817fa79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.484 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[678eb701-a50b-4567-9305-7bdd3b350fb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494725, 'reachable_time': 43254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224291, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.489 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:57:09 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 02:57:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:09.490 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[68334b50-b789-4bf6-ad36-72e2f4652e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.501 186853 DEBUG oslo_concurrency.processutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0tmikh8r" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:09 np0005531887 kernel: tap392e43af-a9: entered promiscuous mode
Nov 22 02:57:09 np0005531887 NetworkManager[55210]: <info>  [1763798229.5899] manager: (tap392e43af-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Nov 22 02:57:09 np0005531887 systemd-udevd[224146]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.590 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:09Z|00200|binding|INFO|Claiming lport 392e43af-a923-4bd6-bdff-445c6101995b for this chassis.
Nov 22 02:57:09 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:09Z|00201|binding|INFO|392e43af-a923-4bd6-bdff-445c6101995b: Claiming fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.595 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 NetworkManager[55210]: <info>  [1763798229.6071] device (tap392e43af-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:57:09 np0005531887 NetworkManager[55210]: <info>  [1763798229.6079] device (tap392e43af-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:57:09 np0005531887 systemd-machined[153180]: New machine qemu-30-instance-0000004d.
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.654 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:09 np0005531887 systemd[1]: Started Virtual Machine qemu-30-instance-0000004d.
Nov 22 02:57:09 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:09Z|00202|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b ovn-installed in OVS
Nov 22 02:57:09 np0005531887 nova_compute[186849]: 2025-11-22 07:57:09.669 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:10Z|00203|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b up in Southbound
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.071 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.073 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.075 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.091 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f13407b4-6933-4288-a512-11deedf152cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.093 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.097 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.097 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[91612b64-26a1-45c1-9fa7-0611e4f93426]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.099 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b410f6e0-ac3b-40ba-a4fe-0ef8cca39a20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.111 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[76ea6833-411b-4723-b10d-adf9bbb2c9b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.128 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[983093f0-e1c0-4de9-9be0-8f7af9937ff9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.152 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk" returned: 0 in 0.970s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.154 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.154 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.config 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.164 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9782ea04-7ffa-4e66-8243-b421968d1be5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 NetworkManager[55210]: <info>  [1763798230.1751] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.176 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7f287964-0c61-459c-8971-9af211cccda0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.219 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd908a3-b5b0-4945-9a92-8d3480df3f72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.223 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[39affe83-52f2-4121-8def-9a64f3049a9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 NetworkManager[55210]: <info>  [1763798230.2508] device (tap06e0f3a5-90): carrier: link connected
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.256 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e1cbca-52a1-46ae-ad23-a5a8e00300ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.277 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fb39f5a6-54db-4869-8069-8f01aa0af029]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497285, 'reachable_time': 20330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224342, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.293 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[86eee9f4-7f8b-402f-8f81-54c653386e0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497285, 'tstamp': 497285}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224343, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.312 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[84b219e5-4a59-46b0-b652-d3f2d0c2b99e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497285, 'reachable_time': 20330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224344, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.349 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cff900fd-5fbc-45ee-bdda-320614e07407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.412 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe1acc5-4cc5-41ac-b5ea-f855e1fe3ae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.413 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.414 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.414 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.416 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531887 NetworkManager[55210]: <info>  [1763798230.4171] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Nov 22 02:57:10 np0005531887 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.419 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.421 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:10Z|00204|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.422 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.423 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.425 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9256399a-cee1-4954-bd74-58deebd16f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.426 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:57:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:10.428 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.434 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -C -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.config 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config" returned: 0 in 0.280s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.435 186853 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.436 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.info 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.453 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.539 186853 DEBUG nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.540 186853 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.541 186853 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.541 186853 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.541 186853 DEBUG nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.542 186853 WARNING nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state resize_migrating.
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.624 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798230.6235497, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.624 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Started (Lifecycle Event)
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.646 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.650 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798230.6236715, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.651 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Paused (Lifecycle Event)
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.654 186853 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -C -r /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_resize/disk.info 192.168.122.100:/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.684 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.688 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.711 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.769 186853 DEBUG nova.compute.manager [req-ef78e9ff-ca9f-4fa3-a2d2-fc618ce424d6 req-7a9de193-7e90-4060-8c75-d657ec2c3af2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.769 186853 DEBUG oslo_concurrency.lockutils [req-ef78e9ff-ca9f-4fa3-a2d2-fc618ce424d6 req-7a9de193-7e90-4060-8c75-d657ec2c3af2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.770 186853 DEBUG oslo_concurrency.lockutils [req-ef78e9ff-ca9f-4fa3-a2d2-fc618ce424d6 req-7a9de193-7e90-4060-8c75-d657ec2c3af2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.770 186853 DEBUG oslo_concurrency.lockutils [req-ef78e9ff-ca9f-4fa3-a2d2-fc618ce424d6 req-7a9de193-7e90-4060-8c75-d657ec2c3af2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.770 186853 DEBUG nova.compute.manager [req-ef78e9ff-ca9f-4fa3-a2d2-fc618ce424d6 req-7a9de193-7e90-4060-8c75-d657ec2c3af2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Processing event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.771 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.777 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798230.776661, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.778 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Resumed (Lifecycle Event)
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.780 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.787 186853 INFO nova.virt.libvirt.driver [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance spawned successfully.
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.788 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.805 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.808 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.818 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.819 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.820 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.820 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.821 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.821 186853 DEBUG nova.virt.libvirt.driver [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 02:57:10 np0005531887 nova_compute[186849]: 2025-11-22 07:57:10.826 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 02:57:10 np0005531887 podman[224385]: 2025-11-22 07:57:10.886039038 +0000 UTC m=+0.090510117 container create 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 02:57:10 np0005531887 podman[224385]: 2025-11-22 07:57:10.821006267 +0000 UTC m=+0.025477346 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:57:10 np0005531887 systemd[1]: Started libpod-conmon-95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab.scope.
Nov 22 02:57:10 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:57:10 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1f9aca14d8ada4adf637ffde626510f950763f32f0bc33e5674be281e9cda3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:57:11 np0005531887 podman[224385]: 2025-11-22 07:57:11.006584172 +0000 UTC m=+0.211055271 container init 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:11 np0005531887 podman[224385]: 2025-11-22 07:57:11.013432353 +0000 UTC m=+0.217903432 container start 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 02:57:11 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [NOTICE]   (224404) : New worker (224406) forked
Nov 22 02:57:11 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [NOTICE]   (224404) : Loading success.
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.427 186853 INFO nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Took 14.38 seconds to spawn the instance on the hypervisor.
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.428 186853 DEBUG nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.729 186853 DEBUG neutronclient.v2_0.client [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.814 186853 INFO nova.compute.manager [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Took 15.25 seconds to build instance.
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.950 186853 DEBUG oslo_concurrency.lockutils [None req-ce8ca059-6321-44f6-b4b1-b3ea40414f6a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.998 186853 DEBUG nova.network.neutron [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updated VIF entry in instance network info cache for port 392e43af-a923-4bd6-bdff-445c6101995b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 02:57:11 np0005531887 nova_compute[186849]: 2025-11-22 07:57:11.999 186853 DEBUG nova.network.neutron [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.055 186853 DEBUG oslo_concurrency.lockutils [req-93e28c87-5e1b-45bf-8ebd-f8d1498fa784 req-0ce4198b-2675-4901-af27-315aa5b289fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.079 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.079 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.080 186853 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.766 186853 DEBUG nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.766 186853 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.767 186853 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.767 186853 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.767 186853 DEBUG nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.767 186853 WARNING nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state resize_migrated.
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.972 186853 DEBUG nova.compute.manager [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.972 186853 DEBUG oslo_concurrency.lockutils [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.973 186853 DEBUG oslo_concurrency.lockutils [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.973 186853 DEBUG oslo_concurrency.lockutils [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.973 186853 DEBUG nova.compute.manager [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:57:12 np0005531887 nova_compute[186849]: 2025-11-22 07:57:12.974 186853 WARNING nova.compute.manager [req-84442a16-ee54-4cac-8d66-63342bc96376 req-598e7c36-33f2-4775-b7f2-c68773d6cc78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state None.
Nov 22 02:57:14 np0005531887 nova_compute[186849]: 2025-11-22 07:57:14.023 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:57:14 np0005531887 nova_compute[186849]: 2025-11-22 07:57:14.353 186853 DEBUG nova.compute.manager [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:57:14 np0005531887 nova_compute[186849]: 2025-11-22 07:57:14.438 186853 INFO nova.compute.manager [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] instance snapshotting
Nov 22 02:57:14 np0005531887 podman[224416]: 2025-11-22 07:57:14.855211717 +0000 UTC m=+0.067106994 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:57:14 np0005531887 nova_compute[186849]: 2025-11-22 07:57:14.929 186853 INFO nova.virt.libvirt.driver [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Beginning live snapshot process#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.222 186853 DEBUG nova.compute.manager [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.223 186853 DEBUG nova.compute.manager [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing instance network info cache due to event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.223 186853 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.223 186853 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.224 186853 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:57:15 np0005531887 virtqemud[186424]: invalid argument: disk vda does not have an active block job
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.474 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.535 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.537 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.598 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.611 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.670 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.671 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.954 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7.delta 1073741824" returned: 0 in 0.283s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:15 np0005531887 nova_compute[186849]: 2025-11-22 07:57:15.955 186853 INFO nova.virt.libvirt.driver [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.025 186853 DEBUG nova.virt.libvirt.guest [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.530 186853 DEBUG nova.virt.libvirt.guest [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.533 186853 INFO nova.virt.libvirt.driver [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.579 186853 DEBUG nova.privsep.utils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.580 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7.delta /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.950 186853 DEBUG oslo_concurrency.processutils [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7.delta /var/lib/nova/instances/snapshots/tmpwcexp_n0/ee2cb085d6884995b2e33890349aa0c7" returned: 0 in 0.370s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:16 np0005531887 nova_compute[186849]: 2025-11-22 07:57:16.952 186853 INFO nova.virt.libvirt.driver [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:57:19 np0005531887 nova_compute[186849]: 2025-11-22 07:57:19.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:20 np0005531887 nova_compute[186849]: 2025-11-22 07:57:20.076 186853 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updated VIF entry in instance network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:57:20 np0005531887 nova_compute[186849]: 2025-11-22 07:57:20.077 186853 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:20 np0005531887 nova_compute[186849]: 2025-11-22 07:57:20.186 186853 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:21 np0005531887 nova_compute[186849]: 2025-11-22 07:57:21.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:21 np0005531887 podman[224463]: 2025-11-22 07:57:21.839653603 +0000 UTC m=+0.057173016 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.842 186853 DEBUG nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.843 186853 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.844 186853 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.844 186853 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.844 186853 DEBUG nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:22 np0005531887 nova_compute[186849]: 2025-11-22 07:57:22.845 186853 WARNING nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state resized and task_state None.#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.489 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798228.4872031, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.489 186853 INFO nova.compute.manager [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.519 186853 DEBUG nova.compute.manager [None req-c2b8c265-e024-4847-8abc-4300b8e2a3bf - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.524 186853 DEBUG nova.compute.manager [None req-c2b8c265-e024-4847-8abc-4300b8e2a3bf - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.548 186853 INFO nova.compute.manager [None req-c2b8c265-e024-4847-8abc-4300b8e2a3bf - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.590 186853 INFO nova.virt.libvirt.driver [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Snapshot image upload complete#033[00m
Nov 22 02:57:23 np0005531887 nova_compute[186849]: 2025-11-22 07:57:23.591 186853 INFO nova.compute.manager [None req-7bf0b5cb-41dc-42ee-96cd-4cb634c85ab6 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Took 9.14 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.975 186853 DEBUG nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.975 186853 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.975 186853 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.976 186853 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.976 186853 DEBUG nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:24 np0005531887 nova_compute[186849]: 2025-11-22 07:57:24.976 186853 WARNING nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state resized and task_state None.#033[00m
Nov 22 02:57:25 np0005531887 podman[224494]: 2025-11-22 07:57:25.841326612 +0000 UTC m=+0.064171300 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 02:57:26 np0005531887 nova_compute[186849]: 2025-11-22 07:57:26.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:26 np0005531887 nova_compute[186849]: 2025-11-22 07:57:26.615 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:26 np0005531887 nova_compute[186849]: 2025-11-22 07:57:26.615 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:26 np0005531887 nova_compute[186849]: 2025-11-22 07:57:26.616 186853 DEBUG nova.compute.manager [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 22 02:57:26 np0005531887 nova_compute[186849]: 2025-11-22 07:57:26.643 186853 DEBUG nova.objects.instance [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'info_cache' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.070 186853 DEBUG neutronclient.v2_0.client [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.071 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.071 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.071 186853 DEBUG nova.network.neutron [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.839 186853 INFO nova.compute.manager [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Rescuing#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.840 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.840 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:27 np0005531887 nova_compute[186849]: 2025-11-22 07:57:27.840 186853 DEBUG nova.network.neutron [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.033 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.041 186853 DEBUG nova.network.neutron [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:29Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:57:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:29Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.841 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.842 186853 DEBUG nova.objects.instance [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.868 186853 DEBUG nova.virt.libvirt.vif [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:22Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.869 186853 DEBUG nova.network.os_vif_util [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.869 186853 DEBUG nova.network.os_vif_util [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.870 186853 DEBUG os_vif [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.872 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.873 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3640f80e-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.873 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.875 186853 INFO os_vif [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71')#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.876 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.876 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:29 np0005531887 nova_compute[186849]: 2025-11-22 07:57:29.987 186853 DEBUG nova.compute.provider_tree [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.012 186853 DEBUG nova.scheduler.client.report [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.081 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.247 186853 INFO nova.scheduler.client.report [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocation for migration 263622db-30b9-4ef7-a87a-380190b6fd0e#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.360 186853 DEBUG oslo_concurrency.lockutils [None req-1865d54e-6b81-44c6-bcf5-20c41feda476 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.638 186853 DEBUG nova.network.neutron [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.665 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:30 np0005531887 podman[224519]: 2025-11-22 07:57:30.835884396 +0000 UTC m=+0.049284929 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 02:57:30 np0005531887 nova_compute[186849]: 2025-11-22 07:57:30.982 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:57:31 np0005531887 nova_compute[186849]: 2025-11-22 07:57:31.556 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:33 np0005531887 kernel: tap392e43af-a9 (unregistering): left promiscuous mode
Nov 22 02:57:33 np0005531887 NetworkManager[55210]: <info>  [1763798253.7989] device (tap392e43af-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:33 np0005531887 nova_compute[186849]: 2025-11-22 07:57:33.808 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:33Z|00205|binding|INFO|Releasing lport 392e43af-a923-4bd6-bdff-445c6101995b from this chassis (sb_readonly=0)
Nov 22 02:57:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:33Z|00206|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b down in Southbound
Nov 22 02:57:33 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:33Z|00207|binding|INFO|Removing iface tap392e43af-a9 ovn-installed in OVS
Nov 22 02:57:33 np0005531887 nova_compute[186849]: 2025-11-22 07:57:33.812 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:33 np0005531887 nova_compute[186849]: 2025-11-22 07:57:33.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:33.850 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:33.852 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:57:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:33.853 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:33.854 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2800dc-923b-49d7-a63a-7a8c3c7c2cd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:33.855 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore#033[00m
Nov 22 02:57:33 np0005531887 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Nov 22 02:57:33 np0005531887 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004d.scope: Consumed 16.112s CPU time.
Nov 22 02:57:33 np0005531887 systemd-machined[153180]: Machine qemu-30-instance-0000004d terminated.
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.035 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.061 186853 INFO nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance shutdown successfully after 3 seconds.#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.066 186853 INFO nova.virt.libvirt.driver [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance destroyed successfully.#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.067 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.085 186853 INFO nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Attempting a stable device rescue#033[00m
Nov 22 02:57:34 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [NOTICE]   (224404) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:34 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [NOTICE]   (224404) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:34 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [WARNING]  (224404) : Exiting Master process...
Nov 22 02:57:34 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [ALERT]    (224404) : Current worker (224406) exited with code 143 (Terminated)
Nov 22 02:57:34 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224400]: [WARNING]  (224404) : All workers exited. Exiting... (0)
Nov 22 02:57:34 np0005531887 systemd[1]: libpod-95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab.scope: Deactivated successfully.
Nov 22 02:57:34 np0005531887 podman[224569]: 2025-11-22 07:57:34.127612503 +0000 UTC m=+0.177514886 container died 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.402 186853 DEBUG nova.compute.manager [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.403 186853 DEBUG oslo_concurrency.lockutils [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.403 186853 DEBUG oslo_concurrency.lockutils [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.403 186853 DEBUG oslo_concurrency.lockutils [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.404 186853 DEBUG nova.compute.manager [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.405 186853 WARNING nova.compute.manager [req-4583cc2b-123e-4bbc-9a36-1c314356b84d req-993c0837-3154-4495-b833-f90a8b9e8e4d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 02:57:34 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab-userdata-shm.mount: Deactivated successfully.
Nov 22 02:57:34 np0005531887 systemd[1]: var-lib-containers-storage-overlay-3a1f9aca14d8ada4adf637ffde626510f950763f32f0bc33e5674be281e9cda3-merged.mount: Deactivated successfully.
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.532 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.537 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.537 186853 INFO nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Creating image(s)#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.538 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.539 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.539 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.540 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.550 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:34 np0005531887 nova_compute[186849]: 2025-11-22 07:57:34.551 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:34 np0005531887 podman[224569]: 2025-11-22 07:57:34.721884923 +0000 UTC m=+0.771787276 container cleanup 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 02:57:34 np0005531887 systemd[1]: libpod-conmon-95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab.scope: Deactivated successfully.
Nov 22 02:57:35 np0005531887 podman[224618]: 2025-11-22 07:57:35.183223741 +0000 UTC m=+0.438169981 container remove 95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.188 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2da05d-4a34-406e-87be-85faf0eb5a74]: (4, ('Sat Nov 22 07:57:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab)\n95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab\nSat Nov 22 07:57:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab)\n95817d0067494cc0e2446f9d3aec5de1019909a5700516d722e7b21647f696ab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.190 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[06df69b7-47c1-4f4e-bcd2-09d722f9ecc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.191 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:35 np0005531887 nova_compute[186849]: 2025-11-22 07:57:35.194 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:35 np0005531887 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 02:57:35 np0005531887 nova_compute[186849]: 2025-11-22 07:57:35.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.214 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ee11460b-337f-4663-8ae3-530d7fdcd273]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.233 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdaf30e-8822-4f77-8671-2d8837eaf961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.235 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2a80c025-80b5-4015-8fc9-12a6da6ac219]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.253 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7749abb5-e989-4fd2-997f-17965ccc899b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497276, 'reachable_time': 40221, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224637, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:35 np0005531887 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.256 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:35.256 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0c5efd-bb84-4eca-aac3-eb4ba8080562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:36 np0005531887 nova_compute[186849]: 2025-11-22 07:57:36.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:36 np0005531887 podman[224638]: 2025-11-22 07:57:36.8556509 +0000 UTC m=+0.065764840 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, version=9.6)
Nov 22 02:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:37.328 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:37.329 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:37.330 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.373 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.730 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.798 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.799 186853 DEBUG nova.virt.images [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] 00c92898-55a1-4fc1-bf14-d61d0c35316c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.802 186853 DEBUG nova.privsep.utils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.803 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.part /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:39 np0005531887 podman[224661]: 2025-11-22 07:57:39.846102597 +0000 UTC m=+0.066588720 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 02:57:39 np0005531887 podman[224662]: 2025-11-22 07:57:39.888998877 +0000 UTC m=+0.101543632 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.920 186853 DEBUG nova.compute.manager [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.921 186853 DEBUG oslo_concurrency.lockutils [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.921 186853 DEBUG oslo_concurrency.lockutils [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.921 186853 DEBUG oslo_concurrency.lockutils [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.922 186853 DEBUG nova.compute.manager [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:39 np0005531887 nova_compute[186849]: 2025-11-22 07:57:39.922 186853 WARNING nova.compute.manager [req-68c58b07-b085-4878-90b1-7db51afa4225 req-c914fb98-98c0-4f01-bcfb-473d5abe520a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.665 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.part /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.converted" returned: 0 in 0.863s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.671 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.737 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29.converted --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.739 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 6.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.753 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.754 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.765 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.829 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:40 np0005531887 nova_compute[186849]: 2025-11-22 07:57:40.831 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29,backing_fmt=raw /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.133 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29,backing_fmt=raw /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue" returned: 0 in 0.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.135 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "5e1b055cd2dda7073fea6bdd458a9a8fcf51be29" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.135 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.147 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.150 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start _get_guest_xml network_info=[{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:10:9b:64"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '00c92898-55a1-4fc1-bf14-d61d0c35316c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.151 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.165 186853 WARNING nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.174 186853 DEBUG nova.virt.libvirt.host [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.175 186853 DEBUG nova.virt.libvirt.host [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.181 186853 DEBUG nova.virt.libvirt.host [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.182 186853 DEBUG nova.virt.libvirt.host [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.183 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.184 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.184 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.184 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.185 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.185 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.185 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.185 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.186 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.186 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.186 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.186 186853 DEBUG nova.virt.hardware [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.187 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.206 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.273 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.274 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.275 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.276 186853 DEBUG oslo_concurrency.lockutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.277 186853 DEBUG nova.virt.libvirt.vif [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-697404939',display_name='tempest-ServerStableDeviceRescueTest-server-697404939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-697404939',id=77,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-0m39mt0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:23Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=e4a6074c-55b0-4529-b184-3ba3ca0dab8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:10:9b:64"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.278 186853 DEBUG nova.network.os_vif_util [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:10:9b:64"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.279 186853 DEBUG nova.network.os_vif_util [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.280 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.299 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <uuid>e4a6074c-55b0-4529-b184-3ba3ca0dab8c</uuid>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <name>instance-0000004d</name>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-697404939</nova:name>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:57:41</nova:creationTime>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        <nova:port uuid="392e43af-a923-4bd6-bdff-445c6101995b">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="serial">e4a6074c-55b0-4529-b184-3ba3ca0dab8c</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="uuid">e4a6074c-55b0-4529-b184-3ba3ca0dab8c</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <target dev="sdb" bus="scsi"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <boot order="1"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:10:9b:64"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <target dev="tap392e43af-a9"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/console.log" append="off"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:57:41 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:57:41 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:57:41 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:57:41 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.308 186853 INFO nova.virt.libvirt.driver [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance destroyed successfully.#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.396 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.397 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.397 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.397 186853 DEBUG nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:10:9b:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.398 186853 INFO nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Using config drive#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.414 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'ec2_ids' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.450 186853 DEBUG nova.objects.instance [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'keypairs' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:41 np0005531887 nova_compute[186849]: 2025-11-22 07:57:41.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:42 np0005531887 nova_compute[186849]: 2025-11-22 07:57:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:43 np0005531887 nova_compute[186849]: 2025-11-22 07:57:43.861 186853 INFO nova.virt.libvirt.driver [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Creating config drive at /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config.rescue#033[00m
Nov 22 02:57:43 np0005531887 nova_compute[186849]: 2025-11-22 07:57:43.866 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbe6ike1l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.000 186853 DEBUG oslo_concurrency.processutils [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbe6ike1l" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:44 np0005531887 kernel: tap392e43af-a9: entered promiscuous mode
Nov 22 02:57:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:44Z|00208|binding|INFO|Claiming lport 392e43af-a923-4bd6-bdff-445c6101995b for this chassis.
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.1073] manager: (tap392e43af-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Nov 22 02:57:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:44Z|00209|binding|INFO|392e43af-a923-4bd6-bdff-445c6101995b: Claiming fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.107 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:44Z|00210|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b ovn-installed in OVS
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.124 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.126 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:44Z|00211|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b up in Southbound
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.137 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '5', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.139 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.141 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:57:44 np0005531887 systemd-udevd[224749]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.1599] device (tap392e43af-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.1610] device (tap392e43af-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.161 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c4612824-e084-4e45-89fa-b8cf432ae3ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.163 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.165 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.165 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[13d6054b-7f88-41cb-830a-871539b90977]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 systemd-machined[153180]: New machine qemu-31-instance-0000004d.
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.167 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2a756e-d6bf-46b4-8122-4f0f47dcdf2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 systemd[1]: Started Virtual Machine qemu-31-instance-0000004d.
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.180 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a636183d-c66a-4125-b66c-10b0b850c1c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.194 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef76635-aa9d-4d7b-82ba-337639eac34b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.230 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1477b4-caf9-4f42-810f-bb1ad6c7dcbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.238 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[25917d61-cb44-466a-87d1-1192b9389fad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.2395] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.271 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[59930469-f027-4ed6-9c94-d9ee1832eacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.275 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[700b33cb-b62c-4da6-9346-80e858aac8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.2981] device (tap06e0f3a5-90): carrier: link connected
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.302 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4489bb49-c709-473d-9730-737fdb528b79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.318 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7180f41b-4eaa-4f8c-83b4-31ab97c08fef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500690, 'reachable_time': 41354, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224785, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.335 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cd03b5ea-57f3-481b-8f74-83ed9e23fda0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500690, 'tstamp': 500690}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224786, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.354 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d3535326-7640-494f-9d9d-f0ab3fadfaae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500690, 'reachable_time': 41354, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224787, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.375 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.389 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[213c8a47-7f28-484a-95ec-647ae00c9bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.446 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c9178582-9e52-41b8-b3a6-13f1af0c49a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.448 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.449 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.449 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.451 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.453 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 NetworkManager[55210]: <info>  [1763798264.4542] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.455 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.456 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:44Z|00212|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.457 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.459 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[947c2edd-158c-453a-a68c-2afb54e5478a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.459 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:57:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:44.461 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:57:44 np0005531887 nova_compute[186849]: 2025-11-22 07:57:44.467 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:44 np0005531887 podman[224819]: 2025-11-22 07:57:44.82091892 +0000 UTC m=+0.028502131 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.370 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for e4a6074c-55b0-4529-b184-3ba3ca0dab8c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.371 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798265.3703694, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.372 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.381 186853 DEBUG nova.compute.manager [None req-8afd7fcd-9f05-4970-a6c4-b67795d629b9 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.418 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.422 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.446 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.447 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798265.3721466, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.447 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.473 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:45 np0005531887 nova_compute[186849]: 2025-11-22 07:57:45.477 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:45 np0005531887 podman[224819]: 2025-11-22 07:57:45.805868237 +0000 UTC m=+1.013451428 container create 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:57:45 np0005531887 systemd[1]: Started libpod-conmon-70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc.scope.
Nov 22 02:57:46 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:57:46 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbf6325cb25cff1eeded041375fbacef716e348b71035f2751c303c7ba598022/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:57:46 np0005531887 podman[224819]: 2025-11-22 07:57:46.082140522 +0000 UTC m=+1.289723733 container init 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 02:57:46 np0005531887 podman[224840]: 2025-11-22 07:57:46.087336332 +0000 UTC m=+0.304903600 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 02:57:46 np0005531887 podman[224819]: 2025-11-22 07:57:46.088779048 +0000 UTC m=+1.296362239 container start 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:57:46 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [NOTICE]   (224868) : New worker (224870) forked
Nov 22 02:57:46 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [NOTICE]   (224868) : Loading success.
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.939 186853 DEBUG nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.940 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.940 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.941 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.941 186853 DEBUG nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.941 186853 WARNING nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state rescued and task_state None.#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.941 186853 DEBUG nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.942 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.942 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.942 186853 DEBUG oslo_concurrency.lockutils [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.942 186853 DEBUG nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:46 np0005531887 nova_compute[186849]: 2025-11-22 07:57:46.943 186853 WARNING nova.compute.manager [req-376868b1-f80c-49d7-aa40-8f129d0d06f0 req-96160b41-c32c-4fdd-892b-9bd0bf2d6d06 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state rescued and task_state None.#033[00m
Nov 22 02:57:47 np0005531887 nova_compute[186849]: 2025-11-22 07:57:47.115 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:47.114 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:47.117 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:57:47 np0005531887 nova_compute[186849]: 2025-11-22 07:57:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.679 186853 INFO nova.compute.manager [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Unrescuing#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.680 186853 DEBUG oslo_concurrency.lockutils [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.681 186853 DEBUG oslo_concurrency.lockutils [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.683 186853 DEBUG nova.network.neutron [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.870 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.940 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:48 np0005531887 nova_compute[186849]: 2025-11-22 07:57:48.942 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.003 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.004 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.069 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.070 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.137 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.rescue --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.334 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.336 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5570MB free_disk=73.35211563110352GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.336 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.337 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.380 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.431 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance e4a6074c-55b0-4529-b184-3ba3ca0dab8c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.432 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.433 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.476 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:57:49 np0005531887 nova_compute[186849]: 2025-11-22 07:57:49.494 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:57:50 np0005531887 nova_compute[186849]: 2025-11-22 07:57:50.092 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:57:50 np0005531887 nova_compute[186849]: 2025-11-22 07:57:50.094 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.089 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.090 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.091 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.092 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.753 186853 DEBUG nova.network.neutron [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.773 186853 DEBUG oslo_concurrency.lockutils [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.774 186853 DEBUG nova.objects.instance [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'flavor' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:51 np0005531887 kernel: tap392e43af-a9 (unregistering): left promiscuous mode
Nov 22 02:57:51 np0005531887 NetworkManager[55210]: <info>  [1763798271.8283] device (tap392e43af-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:57:51 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:51Z|00213|binding|INFO|Releasing lport 392e43af-a923-4bd6-bdff-445c6101995b from this chassis (sb_readonly=0)
Nov 22 02:57:51 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:51Z|00214|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b down in Southbound
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.844 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:51 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:51Z|00215|binding|INFO|Removing iface tap392e43af-a9 ovn-installed in OVS
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.850 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:51 np0005531887 nova_compute[186849]: 2025-11-22 07:57:51.866 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:51 np0005531887 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Nov 22 02:57:51 np0005531887 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004d.scope: Consumed 7.515s CPU time.
Nov 22 02:57:51 np0005531887 systemd-machined[153180]: Machine qemu-31-instance-0000004d terminated.
Nov 22 02:57:51 np0005531887 podman[224896]: 2025-11-22 07:57:51.934596046 +0000 UTC m=+0.058612061 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 02:57:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:51.939 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:51.942 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:57:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:51.944 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:57:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:51.946 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3d99fd-51f8-4bc7-89bf-b9a13097b1f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:51.947 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.025 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.029 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.091 186853 INFO nova.virt.libvirt.driver [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance destroyed successfully.#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.092 186853 DEBUG nova.objects.instance [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [NOTICE]   (224868) : haproxy version is 2.8.14-c23fe91
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [NOTICE]   (224868) : path to executable is /usr/sbin/haproxy
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [WARNING]  (224868) : Exiting Master process...
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [WARNING]  (224868) : Exiting Master process...
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [ALERT]    (224868) : Current worker (224870) exited with code 143 (Terminated)
Nov 22 02:57:52 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[224853]: [WARNING]  (224868) : All workers exited. Exiting... (0)
Nov 22 02:57:52 np0005531887 systemd[1]: libpod-70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc.scope: Deactivated successfully.
Nov 22 02:57:52 np0005531887 podman[224942]: 2025-11-22 07:57:52.199512718 +0000 UTC m=+0.151961777 container died 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 02:57:52 np0005531887 kernel: tap392e43af-a9: entered promiscuous mode
Nov 22 02:57:52 np0005531887 NetworkManager[55210]: <info>  [1763798272.2154] manager: (tap392e43af-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Nov 22 02:57:52 np0005531887 systemd-udevd[224903]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:57:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:52Z|00216|binding|INFO|Claiming lport 392e43af-a923-4bd6-bdff-445c6101995b for this chassis.
Nov 22 02:57:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:52Z|00217|binding|INFO|392e43af-a923-4bd6-bdff-445c6101995b: Claiming fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.221 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 NetworkManager[55210]: <info>  [1763798272.2296] device (tap392e43af-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:57:52 np0005531887 NetworkManager[55210]: <info>  [1763798272.2304] device (tap392e43af-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:57:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:52Z|00218|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b ovn-installed in OVS
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:52Z|00219|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b up in Southbound
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.255 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:57:52 np0005531887 systemd-machined[153180]: New machine qemu-32-instance-0000004d.
Nov 22 02:57:52 np0005531887 systemd[1]: Started Virtual Machine qemu-32-instance-0000004d.
Nov 22 02:57:52 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc-userdata-shm.mount: Deactivated successfully.
Nov 22 02:57:52 np0005531887 systemd[1]: var-lib-containers-storage-overlay-cbf6325cb25cff1eeded041375fbacef716e348b71035f2751c303c7ba598022-merged.mount: Deactivated successfully.
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.544 186853 DEBUG nova.compute.manager [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.545 186853 DEBUG oslo_concurrency.lockutils [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.546 186853 DEBUG oslo_concurrency.lockutils [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.546 186853 DEBUG oslo_concurrency.lockutils [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.546 186853 DEBUG nova.compute.manager [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.546 186853 WARNING nova.compute.manager [req-0cc6b35e-3b5b-46bb-8f5e-d3366b348fb0 req-ad86e126-0cc7-407c-934c-436552c535a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 02:57:52 np0005531887 podman[224942]: 2025-11-22 07:57:52.591984349 +0000 UTC m=+0.544433398 container cleanup 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 02:57:52 np0005531887 systemd[1]: libpod-conmon-70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc.scope: Deactivated successfully.
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.786 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.787 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:57:52 np0005531887 podman[225006]: 2025-11-22 07:57:52.917122993 +0000 UTC m=+0.285843714 container remove 70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.923 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdc3e9b-8e78-4784-9d33-604093e551a6]: (4, ('Sat Nov 22 07:57:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc)\n70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc\nSat Nov 22 07:57:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc)\n70272b2bfcb29ff08aa8fb2c044e8855f63c4db7da51505b734375e5abf122fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.925 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[34bfa320-6c8d-4951-ac6c-a0684d8a998b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.926 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.929 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 02:57:52 np0005531887 nova_compute[186849]: 2025-11-22 07:57:52.942 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.946 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff0748c-006d-4bd8-a162-ea7bd3db7ff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.962 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[88bece57-5bfe-4a32-b9e4-8edf5a7981fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.963 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ea820c95-e0c2-4c30-a034-76ee26a5388c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.982 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2b013c4c-e844-4065-bc33-5676cfe5a429]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500683, 'reachable_time': 39385, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225021, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.986 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.986 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdd708c-f03b-4d95-8faa-4679803719c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.988 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:57:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:52.990 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.005 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d186e6-9a7f-420e-9528-bf5b1ba06dae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.006 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.010 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.010 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6abe9b0e-cfbd-4f07-8d64-ae381e44f35c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.011 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c5476a-b816-4bac-a7fe-73525a31eeb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.023 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[cef2d631-da4d-45d9-a62b-d5c3b744d193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.042 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d39f6cec-4707-4b2e-8761-5c9342e031e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.080 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[979b3847-8eb6-4cee-adc2-284074a7beb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.086 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e2904020-0b1c-4da5-9de2-63be8ad50c90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 NetworkManager[55210]: <info>  [1763798273.0883] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.131 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[afec9c30-6638-46df-9318-b20d3df76037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.135 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f62533-e521-47e2-adeb-787ba2c95b8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.146 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for e4a6074c-55b0-4529-b184-3ba3ca0dab8c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.146 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798273.1455054, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.146 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.150 186853 DEBUG nova.compute.manager [None req-40753d08-8b03-46fd-a408-4c533972b642 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531887 NetworkManager[55210]: <info>  [1763798273.1693] device (tap06e0f3a5-90): carrier: link connected
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.170 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a6d622-7b7f-4fbe-b49e-42ed22bd0d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.184 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.188 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8d35e7-b765-4462-a017-1210180c250c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225052, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.189 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.208 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.208 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798273.148989, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.209 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Started (Lifecycle Event)#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.209 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[532a4ae9-90f5-4722-9919-bc4e9fc47a19]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501577, 'tstamp': 501577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225053, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.233 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[54dbb37d-e985-49e9-b5e7-8e517fef1263]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225054, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.243 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.248 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.270 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2450533c-2f28-4da6-ac6b-5402ff6cf0c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.280 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.351 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0aceffad-b5a7-4a24-aaed-4760bffa8de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.353 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.353 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.354 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:53 np0005531887 NetworkManager[55210]: <info>  [1763798273.3569] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:53 np0005531887 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.361 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:53 np0005531887 ovn_controller[95130]: 2025-11-22T07:57:53Z|00220|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.363 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.366 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.368 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[562ab186-3af2-4359-9cc8-f488c98e5cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.368 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:57:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:53.369 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:57:53 np0005531887 nova_compute[186849]: 2025-11-22 07:57:53.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:53 np0005531887 podman[225087]: 2025-11-22 07:57:53.761602189 +0000 UTC m=+0.026676066 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:57:53 np0005531887 podman[225087]: 2025-11-22 07:57:53.959241134 +0000 UTC m=+0.224314991 container create a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:57:54 np0005531887 systemd[1]: Started libpod-conmon-a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc.scope.
Nov 22 02:57:54 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:57:54 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50e51983c6bef0bbe7854780a42ba219c17700dfded49967db45379275f6f35c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:57:54 np0005531887 podman[225087]: 2025-11-22 07:57:54.098802873 +0000 UTC m=+0.363876750 container init a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:57:54 np0005531887 podman[225087]: 2025-11-22 07:57:54.106408922 +0000 UTC m=+0.371482779 container start a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 02:57:54 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [NOTICE]   (225106) : New worker (225108) forked
Nov 22 02:57:54 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [NOTICE]   (225106) : Loading success.
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.383 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.677 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.678 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.678 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.678 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.678 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.679 186853 WARNING nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state None.#033[00m
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.679 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.679 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.679 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.680 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.680 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.680 186853 WARNING nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state None.
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.680 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.681 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.681 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.681 186853 DEBUG oslo_concurrency.lockutils [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.681 186853 DEBUG nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 02:57:54 np0005531887 nova_compute[186849]: 2025-11-22 07:57:54.681 186853 WARNING nova.compute.manager [req-56d0772a-7500-484a-8eb2-a5e7da869cb5 req-4fe153c2-1104-438d-9024-eec5e82c203c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state None.
Nov 22 02:57:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:57:56.119 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 02:57:56 np0005531887 nova_compute[186849]: 2025-11-22 07:57:56.486 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:57:56 np0005531887 nova_compute[186849]: 2025-11-22 07:57:56.507 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 02:57:56 np0005531887 nova_compute[186849]: 2025-11-22 07:57:56.507 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 02:57:56 np0005531887 nova_compute[186849]: 2025-11-22 07:57:56.507 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:57:56 np0005531887 nova_compute[186849]: 2025-11-22 07:57:56.568 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:57:56 np0005531887 podman[225117]: 2025-11-22 07:57:56.851560437 +0000 UTC m=+0.070346355 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.712 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.714 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.732 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.833 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.834 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.846 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.846 186853 INFO nova.compute.claims [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Claim successful on node compute-1.ctlplane.example.com
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.976 186853 DEBUG nova.compute.provider_tree [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:57:57 np0005531887 nova_compute[186849]: 2025-11-22 07:57:57.990 186853 DEBUG nova.scheduler.client.report [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:58:58.010 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.011 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.064 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.065 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.089 186853 INFO nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.109 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.196 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.197 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.198 186853 INFO nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Creating image(s)
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.198 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.199 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.199 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.215 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.282 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.285 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.286 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.300 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.365 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.366 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.503 186853 DEBUG nova.policy [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.554 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk 1073741824" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.555 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.555 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.626 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.627 186853 DEBUG nova.virt.disk.api [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Checking if we can resize image /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.628 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.688 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.689 186853 DEBUG nova.virt.disk.api [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Cannot resize image /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.690 186853 DEBUG nova.objects.instance [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.702 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.703 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Ensure instance console log exists: /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.704 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.704 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:57:58 np0005531887 nova_compute[186849]: 2025-11-22 07:57:58.704 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:57:59 np0005531887 nova_compute[186849]: 2025-11-22 07:57:59.380 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Successfully created port: f33dc67e-3190-49f9-a981-9b80daf65bdb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 02:57:59 np0005531887 nova_compute[186849]: 2025-11-22 07:57:59.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:57:59 np0005531887 nova_compute[186849]: 2025-11-22 07:57:59.500 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.236 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Successfully updated port: f33dc67e-3190-49f9-a981-9b80daf65bdb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.255 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.255 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.255 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.409 186853 DEBUG nova.compute.manager [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-changed-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.411 186853 DEBUG nova.compute.manager [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Refreshing instance network info cache due to event network-changed-f33dc67e-3190-49f9-a981-9b80daf65bdb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.412 186853 DEBUG oslo_concurrency.lockutils [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 02:58:00 np0005531887 nova_compute[186849]: 2025-11-22 07:58:00.464 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.394 186853 DEBUG nova.network.neutron [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updating instance_info_cache with network_info: [{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.413 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.414 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance network_info: |[{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.414 186853 DEBUG oslo_concurrency.lockutils [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.414 186853 DEBUG nova.network.neutron [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Refreshing network info cache for port f33dc67e-3190-49f9-a981-9b80daf65bdb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.417 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start _get_guest_xml network_info=[{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.426 186853 WARNING nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.432 186853 DEBUG nova.virt.libvirt.host [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.433 186853 DEBUG nova.virt.libvirt.host [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.438 186853 DEBUG nova.virt.libvirt.host [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.439 186853 DEBUG nova.virt.libvirt.host [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.440 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.441 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.441 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.441 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.442 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.442 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.442 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.442 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.443 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.443 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.443 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.443 186853 DEBUG nova.virt.hardware [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.447 186853 DEBUG nova.virt.libvirt.vif [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1175546546',display_name='tempest-ServerStableDeviceRescueTest-server-1175546546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1175546546',id=80,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-wu2qhkzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='
tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:58Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=3f315996-e85d-463b-9123-272512335a7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.448 186853 DEBUG nova.network.os_vif_util [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.449 186853 DEBUG nova.network.os_vif_util [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.450 186853 DEBUG nova.objects.instance [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.465 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <uuid>3f315996-e85d-463b-9123-272512335a7f</uuid>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <name>instance-00000050</name>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1175546546</nova:name>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:58:01</nova:creationTime>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        <nova:port uuid="f33dc67e-3190-49f9-a981-9b80daf65bdb">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="serial">3f315996-e85d-463b-9123-272512335a7f</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="uuid">3f315996-e85d-463b-9123-272512335a7f</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:bc:09:ea"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <target dev="tapf33dc67e-31"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/console.log" append="off"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:58:01 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:58:01 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:58:01 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:58:01 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.466 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Preparing to wait for external event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.467 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.467 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.467 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.468 186853 DEBUG nova.virt.libvirt.vif [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1175546546',display_name='tempest-ServerStableDeviceRescueTest-server-1175546546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1175546546',id=80,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-wu2qhkzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_u
ser_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:58Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=3f315996-e85d-463b-9123-272512335a7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.468 186853 DEBUG nova.network.os_vif_util [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.469 186853 DEBUG nova.network.os_vif_util [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.469 186853 DEBUG os_vif [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.470 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.470 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.471 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.473 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.473 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf33dc67e-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.474 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf33dc67e-31, col_values=(('external_ids', {'iface-id': 'f33dc67e-3190-49f9-a981-9b80daf65bdb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:09:ea', 'vm-uuid': '3f315996-e85d-463b-9123-272512335a7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.476 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:01 np0005531887 NetworkManager[55210]: <info>  [1763798281.4774] manager: (tapf33dc67e-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.480 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.484 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.485 186853 INFO os_vif [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31')#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.570 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:01 np0005531887 podman[225155]: 2025-11-22 07:58:01.584278665 +0000 UTC m=+0.057922204 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.779 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.780 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.780 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:bc:09:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:58:01 np0005531887 nova_compute[186849]: 2025-11-22 07:58:01.780 186853 INFO nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Using config drive#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.494 186853 INFO nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Creating config drive at /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.498 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4o5bhed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.627 186853 DEBUG oslo_concurrency.processutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4o5bhed" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:02 np0005531887 kernel: tapf33dc67e-31: entered promiscuous mode
Nov 22 02:58:02 np0005531887 NetworkManager[55210]: <info>  [1763798282.7013] manager: (tapf33dc67e-31): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Nov 22 02:58:02 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:02Z|00221|binding|INFO|Claiming lport f33dc67e-3190-49f9-a981-9b80daf65bdb for this chassis.
Nov 22 02:58:02 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:02Z|00222|binding|INFO|f33dc67e-3190-49f9-a981-9b80daf65bdb: Claiming fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.704 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.718 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:02 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:02Z|00223|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb ovn-installed in OVS
Nov 22 02:58:02 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:02Z|00224|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb up in Southbound
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.720 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.722 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.727 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:02 np0005531887 systemd-udevd[225196]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.747 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5c252044-65d8-4079-945c-953b7fcb337c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 NetworkManager[55210]: <info>  [1763798282.7592] device (tapf33dc67e-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:58:02 np0005531887 NetworkManager[55210]: <info>  [1763798282.7604] device (tapf33dc67e-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:58:02 np0005531887 systemd-machined[153180]: New machine qemu-33-instance-00000050.
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.786 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8b372d3c-1db3-4309-a3db-f85f635409e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 systemd[1]: Started Virtual Machine qemu-33-instance-00000050.
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.791 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0646a658-cb17-4718-99dd-1b10dfcb22a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.829 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5799d278-5088-4ec8-bd8e-75b3f732abb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.857 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f808e3-9a76-4e29-97cf-5834099ba721]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225208, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.875 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3b6f24-c21f-4047-a2eb-24b1c67882bb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225212, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225212, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.877 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.879 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.885 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.886 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.886 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:02.887 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.982 186853 DEBUG nova.network.neutron [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updated VIF entry in instance network info cache for port f33dc67e-3190-49f9-a981-9b80daf65bdb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:58:02 np0005531887 nova_compute[186849]: 2025-11-22 07:58:02.983 186853 DEBUG nova.network.neutron [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updating instance_info_cache with network_info: [{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.007 186853 DEBUG oslo_concurrency.lockutils [req-e014dfff-1689-4c1b-9b9a-98704d58e482 req-e2ccb3d0-ef75-4117-8c97-8acc730265d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.329 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798283.3287966, 3f315996-e85d-463b-9123-272512335a7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.330 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Started (Lifecycle Event)#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.371 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.377 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798283.3291025, 3f315996-e85d-463b-9123-272512335a7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.378 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.410 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.416 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.434 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.769 186853 DEBUG nova.compute.manager [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.771 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.771 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.772 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.772 186853 DEBUG nova.compute.manager [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Processing event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.772 186853 DEBUG nova.compute.manager [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.773 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.773 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.773 186853 DEBUG oslo_concurrency.lockutils [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.774 186853 DEBUG nova.compute.manager [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.774 186853 WARNING nova.compute.manager [req-ee9aa848-c015-48c9-8686-3a02bdcf839d req-6c174dbe-96a3-4951-bf80-37c9397cf9ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state building and task_state spawning.#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.775 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.790 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798283.7873902, 3f315996-e85d-463b-9123-272512335a7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.790 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.793 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.811 186853 INFO nova.virt.libvirt.driver [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance spawned successfully.#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.812 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.829 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.842 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.845 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.846 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.846 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.846 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.847 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.847 186853 DEBUG nova.virt.libvirt.driver [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.871 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.914 186853 INFO nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Took 5.72 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.914 186853 DEBUG nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:03 np0005531887 nova_compute[186849]: 2025-11-22 07:58:03.987 186853 INFO nova.compute.manager [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Took 6.19 seconds to build instance.#033[00m
Nov 22 02:58:04 np0005531887 nova_compute[186849]: 2025-11-22 07:58:04.005 186853 DEBUG oslo_concurrency.lockutils [None req-113af4d1-30d1-49c1-99cd-132807637d5a 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:05 np0005531887 nova_compute[186849]: 2025-11-22 07:58:05.440 186853 DEBUG nova.compute.manager [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:05 np0005531887 nova_compute[186849]: 2025-11-22 07:58:05.540 186853 INFO nova.compute.manager [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] instance snapshotting#033[00m
Nov 22 02:58:05 np0005531887 nova_compute[186849]: 2025-11-22 07:58:05.862 186853 INFO nova.virt.libvirt.driver [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Beginning live snapshot process#033[00m
Nov 22 02:58:06 np0005531887 virtqemud[186424]: invalid argument: disk vda does not have an active block job
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.022 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.087 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.089 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.154 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.167 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.229 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.230 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:06 np0005531887 nova_compute[186849]: 2025-11-22 07:58:06.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.116 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016.delta 1073741824" returned: 0 in 0.886s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.118 186853 INFO nova.virt.libvirt.driver [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.200 186853 DEBUG nova.virt.libvirt.guest [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:58:07 np0005531887 podman[225236]: 2025-11-22 07:58:07.234263363 +0000 UTC m=+0.078151789 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Nov 22 02:58:07 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:07Z|00225|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.314 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.704 186853 DEBUG nova.virt.libvirt.guest [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 02:58:07 np0005531887 nova_compute[186849]: 2025-11-22 07:58:07.709 186853 INFO nova.virt.libvirt.driver [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 02:58:08 np0005531887 nova_compute[186849]: 2025-11-22 07:58:08.049 186853 DEBUG nova.privsep.utils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:58:08 np0005531887 nova_compute[186849]: 2025-11-22 07:58:08.051 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016.delta /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:10 np0005531887 nova_compute[186849]: 2025-11-22 07:58:10.011 186853 DEBUG oslo_concurrency.processutils [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016.delta /var/lib/nova/instances/snapshots/tmp5pxp37i1/2e970b6305ac4855a81fd1db15bf7016" returned: 0 in 1.960s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:10 np0005531887 nova_compute[186849]: 2025-11-22 07:58:10.013 186853 INFO nova.virt.libvirt.driver [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Snapshot extracted, beginning image upload#033[00m
Nov 22 02:58:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:10Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:58:10 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:10Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:9b:64 10.100.0.7
Nov 22 02:58:10 np0005531887 podman[225278]: 2025-11-22 07:58:10.850907456 +0000 UTC m=+0.070051516 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Nov 22 02:58:10 np0005531887 podman[225279]: 2025-11-22 07:58:10.880385191 +0000 UTC m=+0.093463960 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 02:58:11 np0005531887 nova_compute[186849]: 2025-11-22 07:58:11.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:11 np0005531887 nova_compute[186849]: 2025-11-22 07:58:11.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:12 np0005531887 nova_compute[186849]: 2025-11-22 07:58:12.738 186853 INFO nova.virt.libvirt.driver [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Snapshot image upload complete#033[00m
Nov 22 02:58:12 np0005531887 nova_compute[186849]: 2025-11-22 07:58:12.739 186853 INFO nova.compute.manager [None req-91351b82-a44a-4a35-bf18-c02ac110f7d0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Took 7.19 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 02:58:14 np0005531887 nova_compute[186849]: 2025-11-22 07:58:14.861 186853 INFO nova.compute.manager [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Rescuing#033[00m
Nov 22 02:58:14 np0005531887 nova_compute[186849]: 2025-11-22 07:58:14.862 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:14 np0005531887 nova_compute[186849]: 2025-11-22 07:58:14.862 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:14 np0005531887 nova_compute[186849]: 2025-11-22 07:58:14.862 186853 DEBUG nova.network.neutron [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:58:16 np0005531887 nova_compute[186849]: 2025-11-22 07:58:16.484 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:16 np0005531887 nova_compute[186849]: 2025-11-22 07:58:16.575 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:16 np0005531887 podman[225324]: 2025-11-22 07:58:16.832367536 +0000 UTC m=+0.054524200 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 02:58:17 np0005531887 nova_compute[186849]: 2025-11-22 07:58:17.115 186853 DEBUG nova.network.neutron [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updating instance_info_cache with network_info: [{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:17 np0005531887 nova_compute[186849]: 2025-11-22 07:58:17.151 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:17 np0005531887 nova_compute[186849]: 2025-11-22 07:58:17.551 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 02:58:21 np0005531887 nova_compute[186849]: 2025-11-22 07:58:21.487 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:21 np0005531887 nova_compute[186849]: 2025-11-22 07:58:21.577 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:22Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:22 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:22Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:22 np0005531887 podman[225370]: 2025-11-22 07:58:22.861220756 +0000 UTC m=+0.068277503 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 02:58:26 np0005531887 nova_compute[186849]: 2025-11-22 07:58:26.491 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:26 np0005531887 nova_compute[186849]: 2025-11-22 07:58:26.580 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:27 np0005531887 nova_compute[186849]: 2025-11-22 07:58:27.598 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 02:58:27 np0005531887 podman[225391]: 2025-11-22 07:58:27.846474999 +0000 UTC m=+0.063295219 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 02:58:29 np0005531887 kernel: tapf33dc67e-31 (unregistering): left promiscuous mode
Nov 22 02:58:29 np0005531887 NetworkManager[55210]: <info>  [1763798309.7864] device (tapf33dc67e-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:58:29 np0005531887 nova_compute[186849]: 2025-11-22 07:58:29.796 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:29Z|00226|binding|INFO|Releasing lport f33dc67e-3190-49f9-a981-9b80daf65bdb from this chassis (sb_readonly=0)
Nov 22 02:58:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:29Z|00227|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb down in Southbound
Nov 22 02:58:29 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:29Z|00228|binding|INFO|Removing iface tapf33dc67e-31 ovn-installed in OVS
Nov 22 02:58:29 np0005531887 nova_compute[186849]: 2025-11-22 07:58:29.799 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:29 np0005531887 nova_compute[186849]: 2025-11-22 07:58:29.810 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:29 np0005531887 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Deactivated successfully.
Nov 22 02:58:29 np0005531887 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Consumed 16.576s CPU time.
Nov 22 02:58:29 np0005531887 systemd-machined[153180]: Machine qemu-33-instance-00000050 terminated.
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.921 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.922 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.923 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.940 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[66bcb67a-ab20-443c-882c-eeeb391513b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.973 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[426c5867-653a-4146-8d43-73e52115a376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:29.976 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ba42fbf8-9aa2-4ac7-83a6-701a6e2c8fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.007 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[27647107-6d86-4c06-ba80-ac2b7a92da1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.027 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3f99ce63-d91f-4aa9-b494-3bcf3d0122c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225423, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.048 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[244f4356-8a00-40c4-bc5a-4f5515ff2235]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225432, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225432, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.051 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.053 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.060 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.061 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.061 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.062 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.062 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.170 186853 DEBUG nova.compute.manager [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.171 186853 DEBUG oslo_concurrency.lockutils [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.171 186853 DEBUG oslo_concurrency.lockutils [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.171 186853 DEBUG oslo_concurrency.lockutils [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.171 186853 DEBUG nova.compute.manager [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.171 186853 WARNING nova.compute.manager [req-2b0d8b26-ddf8-4d55-ab3a-1be907bd4ab6 req-63786988-9774-41b8-9659-d8d3297c3ec7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.615 186853 INFO nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.623 186853 INFO nova.virt.libvirt.driver [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance destroyed successfully.#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.623 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.642 186853 INFO nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Attempting a stable device rescue#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.722 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.722 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:30.723 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.937 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.943 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.944 186853 INFO nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Creating image(s)#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.945 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.945 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.946 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.946 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.957 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "253915408629bc954cd98505927b671e71b5d9d2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:30 np0005531887 nova_compute[186849]: 2025-11-22 07:58:30.958 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "253915408629bc954cd98505927b671e71b5d9d2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:31 np0005531887 nova_compute[186849]: 2025-11-22 07:58:31.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:31 np0005531887 nova_compute[186849]: 2025-11-22 07:58:31.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:31 np0005531887 podman[225442]: 2025-11-22 07:58:31.852327922 +0000 UTC m=+0.053914595 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.510 186853 DEBUG nova.compute.manager [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.511 186853 DEBUG oslo_concurrency.lockutils [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.511 186853 DEBUG oslo_concurrency.lockutils [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.511 186853 DEBUG oslo_concurrency.lockutils [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.511 186853 DEBUG nova.compute.manager [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:32 np0005531887 nova_compute[186849]: 2025-11-22 07:58:32.512 186853 WARNING nova.compute.manager [req-0db96641-4e82-4a7b-92ce-f1fa029d821a req-1f265785-4934-4cd2-8327-b7db8d571d87 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 02:58:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:32.725 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:33 np0005531887 nova_compute[186849]: 2025-11-22 07:58:33.175 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:33 np0005531887 nova_compute[186849]: 2025-11-22 07:58:33.237 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.part --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:33 np0005531887 nova_compute[186849]: 2025-11-22 07:58:33.238 186853 DEBUG nova.virt.images [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] ee0dbd5c-92f0-4228-9588-ef2e081e7580 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 02:58:33 np0005531887 nova_compute[186849]: 2025-11-22 07:58:33.293 186853 DEBUG nova.privsep.utils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 02:58:33 np0005531887 nova_compute[186849]: 2025-11-22 07:58:33.294 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.part /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.678 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.part /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.converted" returned: 0 in 1.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.683 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.771 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2.converted --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.772 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "253915408629bc954cd98505927b671e71b5d9d2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.787 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "253915408629bc954cd98505927b671e71b5d9d2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.787 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "253915408629bc954cd98505927b671e71b5d9d2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.799 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.857 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:34 np0005531887 nova_compute[186849]: 2025-11-22 07:58:34.858 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2,backing_fmt=raw /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.288 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2,backing_fmt=raw /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.rescue" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.289 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "253915408629bc954cd98505927b671e71b5d9d2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.289 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.306 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.310 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start _get_guest_xml network_info=[{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:bc:09:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'ee0dbd5c-92f0-4228-9588-ef2e081e7580', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.310 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.331 186853 WARNING nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.338 186853 DEBUG nova.virt.libvirt.host [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.339 186853 DEBUG nova.virt.libvirt.host [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.342 186853 DEBUG nova.virt.libvirt.host [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.342 186853 DEBUG nova.virt.libvirt.host [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.343 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.343 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.344 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.344 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.344 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.344 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.345 186853 DEBUG nova.virt.hardware [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.346 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.362 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.420 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.422 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.422 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.423 186853 DEBUG oslo_concurrency.lockutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.424 186853 DEBUG nova.virt.libvirt.vif [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1175546546',display_name='tempest-ServerStableDeviceRescueTest-server-1175546546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1175546546',id=80,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-wu2qhkzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:12Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=3f315996-e85d-463b-9123-272512335a7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:bc:09:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.424 186853 DEBUG nova.network.os_vif_util [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:bc:09:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.425 186853 DEBUG nova.network.os_vif_util [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.426 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.441 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <uuid>3f315996-e85d-463b-9123-272512335a7f</uuid>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <name>instance-00000050</name>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1175546546</nova:name>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:58:35</nova:creationTime>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        <nova:port uuid="f33dc67e-3190-49f9-a981-9b80daf65bdb">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="serial">3f315996-e85d-463b-9123-272512335a7f</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="uuid">3f315996-e85d-463b-9123-272512335a7f</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.rescue"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <target dev="sdb" bus="usb"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <boot order="1"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:bc:09:ea"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <target dev="tapf33dc67e-31"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/console.log" append="off"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:58:35 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:58:35 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:58:35 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:58:35 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.448 186853 INFO nova.virt.libvirt.driver [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance destroyed successfully.#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.550 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.550 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.550 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.558 186853 DEBUG nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:bc:09:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.559 186853 INFO nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Using config drive#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.574 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.604 186853 DEBUG nova.objects.instance [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'keypairs' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.961 186853 INFO nova.virt.libvirt.driver [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Creating config drive at /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config.rescue#033[00m
Nov 22 02:58:35 np0005531887 nova_compute[186849]: 2025-11-22 07:58:35.966 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav1uu0l4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.093 186853 DEBUG oslo_concurrency.processutils [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav1uu0l4" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:36 np0005531887 kernel: tapf33dc67e-31: entered promiscuous mode
Nov 22 02:58:36 np0005531887 NetworkManager[55210]: <info>  [1763798316.1846] manager: (tapf33dc67e-31): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Nov 22 02:58:36 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:36Z|00229|binding|INFO|Claiming lport f33dc67e-3190-49f9-a981-9b80daf65bdb for this chassis.
Nov 22 02:58:36 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:36Z|00230|binding|INFO|f33dc67e-3190-49f9-a981-9b80daf65bdb: Claiming fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.195 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '5', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.197 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis#033[00m
Nov 22 02:58:36 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:36Z|00231|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb ovn-installed in OVS
Nov 22 02:58:36 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:36Z|00232|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb up in Southbound
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.198 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.202 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 systemd-udevd[225509]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.220 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[44756f16-f865-471a-9d9d-254ce7d2ef9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 systemd-machined[153180]: New machine qemu-34-instance-00000050.
Nov 22 02:58:36 np0005531887 NetworkManager[55210]: <info>  [1763798316.2348] device (tapf33dc67e-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:58:36 np0005531887 NetworkManager[55210]: <info>  [1763798316.2356] device (tapf33dc67e-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:58:36 np0005531887 systemd[1]: Started Virtual Machine qemu-34-instance-00000050.
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.257 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbc34a0-19cb-4ae8-85c3-bd8b9b6d4547]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.262 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e790520b-4e60-483e-bcbb-36f51ea60556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.294 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b43fb389-ea1e-4f9a-af36-6673716e9d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.314 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e087c99f-4f41-4bcf-a8c4-1b2f28756e0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225522, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.333 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d69a50-7d22-453c-bc94-67e05821b872]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225523, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225523, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.335 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.336 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.340 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.340 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.340 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:36.341 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.495 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.585 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:36.666 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'user_id': '0d84421d986b40f481c0caef764443e2', 'hostId': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:36.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3f315996-e85d-463b-9123-272512335a7f', 'name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000050', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'user_id': '0d84421d986b40f481c0caef764443e2', 'hostId': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 02:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 02:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:36.697 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:36.698 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.708 186853 DEBUG nova.compute.manager [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.708 186853 DEBUG oslo_concurrency.lockutils [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.709 186853 DEBUG oslo_concurrency.lockutils [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.709 186853 DEBUG oslo_concurrency.lockutils [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.710 186853 DEBUG nova.compute.manager [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.710 186853 WARNING nova.compute.manager [req-98ce7bfe-613f-4625-b280-aedef2209071 req-3ff7c89f-4b79-4957-9c5e-00a315a96646 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.975 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 3f315996-e85d-463b-9123-272512335a7f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.975 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798316.9748788, 3f315996-e85d-463b-9123-272512335a7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.976 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.987 186853 DEBUG nova.compute.manager [None req-6c416bba-97af-4b7c-8d36-84fee0ffd15f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.994 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:36 np0005531887 nova_compute[186849]: 2025-11-22 07:58:36.996 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:37 np0005531887 nova_compute[186849]: 2025-11-22 07:58:37.020 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 02:58:37 np0005531887 nova_compute[186849]: 2025-11-22 07:58:37.021 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798316.9751852, 3f315996-e85d-463b-9123-272512335a7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:37 np0005531887 nova_compute[186849]: 2025-11-22 07:58:37.021 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Started (Lifecycle Event)#033[00m
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.022 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.023 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.024 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7705bdcd-bb15-41e6-aa6e-f51d40effa42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:36.669681', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cba23c4-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '869710d2fb5bc4c006780436e0b2365dd4d392c6d0c97704439c51a5f4efb5d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': 
None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:36.669681', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cba3274-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '8a57e1a90badb22d743e8bcd81561a90d0ea490a44147e0732e096a39c94a953'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:36.669681', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cebcad2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '0ed6e28ac759fb0845a379c7baef2df3d6ef7a6c2a4ef835b57a89988b5a4aef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:36.669681', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cebdc66-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '4cebc2362b8ad2bf6c697fc500923b6ede51b9ea4f1faefaa87be7dba80b9e42'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 
'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:36.669681', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cebe968-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '316763e4f29e9ca508eb6f0f8df4878bc3e1f36cf0686e74c7f11431a33fcbca'}]}, 'timestamp': '2025-11-22 07:58:37.024448', '_unique_id': 'df39f0014fb942c0bc82efae8f426bf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.025 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.027 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.044 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/memory.usage volume: 42.23828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 nova_compute[186849]: 2025-11-22 07:58:37.051 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:37 np0005531887 nova_compute[186849]: 2025-11-22 07:58:37.054 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.061 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.062 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3f315996-e85d-463b-9123-272512335a7f: ceilometer.compute.pollsters.NoVolumeException
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b786b90-49c9-4816-ad5e-dc0746902dc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.23828125, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'timestamp': '2025-11-22T07:58:37.027936', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0cef0dc8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.714085105, 'message_signature': '87ea66280d8f70188d95b2a7a8cac352903ad4e3c2d5976365d26f6336ca6c63'}]}, 'timestamp': '2025-11-22 07:58:37.062410', '_unique_id': 'a963b3a91fda426ebca2ad76c63e496b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.063 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.064 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.065 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>]
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.067 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e4a6074c-55b0-4529-b184-3ba3ca0dab8c / tap392e43af-a9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.068 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.070 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3f315996-e85d-463b-9123-272512335a7f / tapf33dc67e-31 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.070 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0963e4db-0104-49ff-b493-5e420b9a4513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.065480', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cf2a76c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': 'b1d5e47452dbe78f3ebbb4fb6428955d231c7118c2a4185c28a03194fd6c636f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.065480', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cf30752-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '18e1520d1c5b0bad0cb665f1af3b48133b900d8d1cdbc3235af4a5ba2a4ecc65'}]}, 'timestamp': '2025-11-22 07:58:37.071074', '_unique_id': 'e27f935794b6486ea0830ad3700a90e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.072 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.073 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.073 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '039754b6-97a0-492b-b50f-028f33cc89fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.073383', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cf36fb2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '17ea06cc1f424f990b0f0570699b5131009c6ef7d110f369bfba0087a99fcd47'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.073383', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cf37d04-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': 'e3fcb26290a5b27563538f81e42d86eda66f1929c4c404c0d4e6c528324cd056'}]}, 'timestamp': '2025-11-22 07:58:37.074105', '_unique_id': '54afd0471fd14c268b06d0fc1ad2230f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.074 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.076 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.bytes volume: 311296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.076 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.076 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.077 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.077 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6487e93f-9fd9-4376-83ad-3b5230e96bb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 311296, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.076323', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cf3e1fe-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '03d10838e776e0aa888f915c3c4f6e01edf777146a1c062a3b3b3eb5d22bfb29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 
'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.076323', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cf3ec62-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': 'd109dd9d7cdbac77bf46e5b1e3a191bbd6638a6cdfaf74cabb132ca4f83e9318'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.076323', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cf3f720-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': 'a5a9a1c536aed4df43ff63d80211568883084e9e255bcece52daf406523b9394'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.076323', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cf40210-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '5e172adc4164cbfdd38b8896405d4fb8846506ad6eced32d7569c77e1de8e93c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 
'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.076323', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cf40e68-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': 'ff38b0f7a33ac0a0708ce50c32b30c5d3792a82734e1aaec849a380f8ff62811'}]}, 'timestamp': '2025-11-22 07:58:37.077777', '_unique_id': 'b20e20acd89a43bf896323180c8a05bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.078 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.079 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.080 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce1eb8d0-980c-4b3d-907b-6285f37aeb63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.079849', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cf46aa2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '10d7d9711c84d0d905ce145dc9f88fea9e2dc294377ba8401841e59998288dbb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.079849', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cf47646-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '62a5fb7a22866a002467fc90bea00d75ca20af064a6a64a7f99538e4c4f11f2b'}]}, 'timestamp': '2025-11-22 07:58:37.080436', '_unique_id': '383f40f4dda24583bcb2d863d858ca83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.091 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.092 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.109 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.109 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.110 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13caa819-85f9-4595-8cff-5df4a45c2d0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.082072', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cf656d2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': '317a660cc24ef121a11e2a04b28b6efb6e530699877b0077edcf52df9be787ed'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 
'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.082072', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cf66aa0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': 'fb16fb9978a967897b36c48febcebc4b293d2ffcd9fa91317ea591e359413d1c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.082072', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cf8f4aa-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': '1944cb09dc7db140821c8faa987c42b295baa95c6a49232bc40cbef81b519693'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.082072', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cf90256-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': '038408138808f4f90d7b34dadcb2c51ee57c757f553d4150ed975bfc00104d88'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 
'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.082072', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cf90df0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': 'd011bd4e39817ef3bd68de2a322a2e0c438e33452d90c60d017eb8c5f3c7fa19'}]}, 'timestamp': '2025-11-22 07:58:37.110544', '_unique_id': 'd6202f38400f4e27be4e21c2c3ace921'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.111 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.113 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.113 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96dcb83b-ed59-448b-8f78-06433c6394c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.113101', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cf97eca-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': 'f99a00f15d6f605c8ebc5308ff80dbeccd77248d49d89721351ee846d4890aed'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.113101', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cf9894c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '57bd83b524d72285aff977bb82afede047a2d4ee61abd35f08887d4330756108'}]}, 'timestamp': '2025-11-22 07:58:37.113689', '_unique_id': 'ed4ddd31bbda472dbd29e5287f652178'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.114 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.115 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.incoming.bytes volume: 1646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.115 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ead5de3c-6300-4a19-adfc-ee53d9ea7f96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1646, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.115305', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cf9d44c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '9d9f673905876bc6d94d591db9cf5dd9dd338492dc1e82663c2622237e2ce254'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.115305', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cf9e27a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '67e47dd9a72cc5dd41acf99c54e842533a3d7529ddf0dfbbb17ba9d360f5ad25'}]}, 'timestamp': '2025-11-22 07:58:37.116006', '_unique_id': '49710f3d051d460baf2a60f1d7e0b3b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.116 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.117 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14bae506-3e2f-4b81-ae86-d872b4d5b690', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.117871', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cfa3856-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '13c108a5c71b6cd66d442298e611e598d88640f98ef73f044b2ad512fdd7aa1b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.117871', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cfa440e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '8da5b44261d267e604c58fc23dd93d71c53b15e5b60b3e294f619a94bfbe8958'}]}, 'timestamp': '2025-11-22 07:58:37.118472', '_unique_id': '0fa6695cca9e4f9495124aa44526cdfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.118 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.120 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.latency volume: 677237694 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.120 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.120 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.120 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a672470-8e13-4f1c-807f-3a28eb17f8ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 677237694, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.120042', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfa8c66-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '1ea608b89d873fca54cf2ee209cbe8c206a010719f81ec36a1474772ce259a31'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': 
None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.120042', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfa97b0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '530d121c05722f5fd33172ecff457466debca9f58fa78667a1b708f37a29c2f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.120042', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfaa0fc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '8df6895f7a9d10ec2862c546641e2b361831db608f0eb11d64ed2a61dec2c6d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.120042', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfaac32-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '53e1f1d498db6b3f03ef86a71db55a7cfc7856154e8f78b036e0448df3b7b344'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 
'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.120042', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfab5e2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '4e0627cb294edbba60ce957d18260de2ecf6aa90d89a533b5bffb807a794f2b1'}]}, 'timestamp': '2025-11-22 07:58:37.121382', '_unique_id': 'e21430c1b10d42c88ce23df4f3069086'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.121 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.122 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.123 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>]
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.123 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.123 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0a8161c-c6ae-415a-aef6-0d0a8a4e9f93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.123345', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cfb0cd6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': 'b887103d2d762b42742ba61b209eb874da919faae8f816ef15a011552f2c52c3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.123345', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cfb17bc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': 'ab01948f03d6ec004bf3d43d8aec12d25ab007782c0f06a5f97bb7db2f7704bb'}]}, 'timestamp': '2025-11-22 07:58:37.123887', '_unique_id': 'fe9b7fb7d65f47419a56f166f21855d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.124 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.125 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.125 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.126 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.126 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.126 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99cd7c0a-f4d8-4a33-b2f0-3e7335bb89f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.125700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfb68b6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': '53d7071ad90c7d46ab122f85cd28e818e040a715014fc5ef2ec2f1ee8fe3d802'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 
'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.125700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfb71f8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': 'd0899690b479047400e5b952c793b04535046c4386cdb3ea89f778b1cce078e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.125700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfb7c02-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': '60a8b6da4f46d5e7a48e4295807852ff2495a5cded0fcaf5430d470a271cf323'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.125700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfb8530-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': 'adcd0b25ff793789bd25f918eeb348417221cd260e8aa4acac4d77488319d17a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 
'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.125700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfb8de6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': '3ccaa2ca80fad62b53c4e7f18b8beb5eb7f8a7d9ba8e381f1ff9591b15a017bf'}]}, 'timestamp': '2025-11-22 07:58:37.126901', '_unique_id': '0769085351c146ccb3ed77819592f462'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.127 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.128 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.latency volume: 3317987841 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.128 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.latency volume: 326031591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.128 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.129 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.129 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1320d840-41d5-4a84-bece-c61040089530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3317987841, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.128433', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfbd364-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '9afee478b9e6d34cffbafc8624c2ee1dcdf153d356e6c9744dfc7f7855a2e70b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 326031591, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 
'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.128433', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfbdc7e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': 'd0dc94a7d7c963625faf7cb9e335dd44e51e6a2fe128b6408a5600870c280d5a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.128433', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': 
'', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfbe548-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': 'c50ebf14663ee456c21463f4b9cd89d11e496e39b6574aa935103767cadf1139'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.128433', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfbeed0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': 'ccc67ae0f72f201d51503e2eb1e4e3e8200487eefdbd1231a00c1c06e20cedb4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 
'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.128433', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfbf844-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '7e16f7abfb9000035c3341a41c35a07d3f9e509fe6a713adcd18df75b9cb8384'}]}, 'timestamp': '2025-11-22 07:58:37.129649', '_unique_id': 'b5fa8686b38f423f90d2a170cba6e4d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.130 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.131 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.131 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.131 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.131 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddc4bfd2-4acf-41db-bd78-90572608b551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.131211', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfc4128-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '58ee215c8120163b6c30ac89282fb251a554b34577c949bc195fd45aab144d6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 
'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.131211', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfc4bbe-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '4a50965f790f14e7aa31c22aa63aee062cc895a8338c0bb3ccf4bbdafa5af01c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.131211', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfc54c4-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '3cd64d017edf89587ee0bb965ec6ea639a2838d136c6a1e4d94bbf5b05e71ce5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.131211', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfc5d66-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '6421e4801f13128be964aac957d22d61713e31c87957b7ba8281ff18e3e35e60'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.131211', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfc673e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '9daca45c0bae77f35e1c542bd5af957e70fb922ec436621871436f17df608e10'}]}, 'timestamp': '2025-11-22 07:58:37.132466', '_unique_id': '9936d3d9670443de8ee14412648263a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.132 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.134 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.134 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.134 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.134 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cab60d0-e10c-49bb-85d7-8d0d4a39c320', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.134047', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfcaec4-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': '8ada0795988fb8fbf0e34ed187f0a72e625b83bfebe88ab8e129d24e66692260'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 
'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.134047', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfcb900-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.751909848, 'message_signature': '59876a3d924aaac863082a690644ce3f77ac7fde61a9a81bd0e0108c4d144a44'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.134047', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfcc1e8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': '935fe5a2126dfcd4c26e6dba34add6782eafe3fdbc48f3b54a170e0b253bacee'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.134047', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfccaa8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': 'bb857cfcabc8dcc0152c1e140741e174458effd954a21e7b989e217688415011'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 
'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.134047', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfcd336-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.763146618, 'message_signature': 'ebd64226d002a88bdb99506054535b8d6e6c63c970060233e65433573ff67191'}]}, 'timestamp': '2025-11-22 07:58:37.135229', '_unique_id': 'fe72b9bc903c497f843912494e34864e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.135 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.137 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.requests volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.137 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.137 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.137 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.137 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8776c87-e945-4a31-a940-573faa3831b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 34, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-vda', 'timestamp': '2025-11-22T07:58:37.136978', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfd21ba-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '8bb612a067ad32b24edf5bee616ed9f48518a52711bd847fb5fbae53caf765d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 
'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c-sda', 'timestamp': '2025-11-22T07:58:37.136978', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfd2be2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.339476419, 'message_signature': '0177b1b8b274827d740a10444e9d9c9f3a07ee602c0eefec2620f5111334af7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-vda', 'timestamp': '2025-11-22T07:58:37.136978', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cfd34e8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '3490181301707d948aa1e94e4fe7e17726b995344653738ec547342739635bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sda', 'timestamp': '2025-11-22T07:58:37.136978', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cfd3d80-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '26f80261825bee999c0bc8eac4d6151afe72721b183eaeaee209f953be066863'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': '3f315996-e85d-463b-9123-272512335a7f-sdb', 'timestamp': '2025-11-22T07:58:37.136978', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sdb'}, 'message_id': '0cfd4604-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.36840911, 'message_signature': '712eafdd676e1cda438237326eccb67fa2e8886dc7a4d2da833fc1eb65352f60'}]}, 'timestamp': '2025-11-22 07:58:37.138168', '_unique_id': 'f583dac5f5a84ca5baa49f71cbc2fdfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.138 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.139 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.139 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26d1e56-37fb-436d-87f6-2bd83da122db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.139727', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cfd8c9a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '0cef10af5ef3b07613d6082390b9c52e77060e80dbf298fec7816e39fcd2e3b7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.139727', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cfd95fa-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '1a49e436c31534986e4b3a8f1cefa336b860214b796b61ee5f72d945055de3d5'}]}, 'timestamp': '2025-11-22 07:58:37.140223', '_unique_id': '63dbb3f856eb4a5b935282b1694618b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.140 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.141 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.141 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/cpu volume: 14830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.141 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/cpu volume: 60000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1066704-4038-4491-b372-39d168c09fbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14830000000, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'timestamp': '2025-11-22T07:58:37.141628', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'instance-0000004d', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0cfdd696-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.714085105, 'message_signature': '94ec512a9599c7653f35d33e678803713e8cdee05a76a07a7f38cd86634b6a36'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60000000, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 
'3f315996-e85d-463b-9123-272512335a7f', 'timestamp': '2025-11-22T07:58:37.141628', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'instance-00000050', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0cfddf74-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.731329464, 'message_signature': 'cee58fb90c561a314e3b5474c583fabc7ccf3d3ac8938a2469b8d593799011a5'}]}, 'timestamp': '2025-11-22 07:58:37.142096', '_unique_id': 'cfeac936643e401cb260ccfe88f8abea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.142 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.143 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.143 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '377af68c-793b-4971-b796-c66dc1c2ab32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.143481', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cfe1f84-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': '252d2b2ba1163754b93ecc7c6fe61dd96d84f0f4836ae36bd60d3f28987d9efe'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.143481', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cfe28bc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '5e24a918be6af1c7a9cc0de57a7442f1169e2bdabf9b62735898f0791b012856'}]}, 'timestamp': '2025-11-22 07:58:37.143980', '_unique_id': '212db67b4a944564b305cb02c3ee24b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.144 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.145 12 DEBUG ceilometer.compute.pollsters [-] e4a6074c-55b0-4529-b184-3ba3ca0dab8c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.145 12 DEBUG ceilometer.compute.pollsters [-] 3f315996-e85d-463b-9123-272512335a7f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a738a24c-2c50-4a9a-989c-0df0c6bae3f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-0000004d-e4a6074c-55b0-4529-b184-3ba3ca0dab8c-tap392e43af-a9', 'timestamp': '2025-11-22T07:58:37.145429', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-697404939', 'name': 'tap392e43af-a9', 'instance_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:9b:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap392e43af-a9'}, 'message_id': '0cfe6b1a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.735328634, 'message_signature': 'd09ddb87702a4d55432e25c85a9ea46e2284a16d0d34fbd9e70c5e8492534561'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_name': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_name': None, 'resource_id': 'instance-00000050-3f315996-e85d-463b-9123-272512335a7f-tapf33dc67e-31', 'timestamp': '2025-11-22T07:58:37.145429', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1175546546', 'name': 'tapf33dc67e-31', 'instance_id': '3f315996-e85d-463b-9123-272512335a7f', 'instance_type': 'm1.nano', 'host': '324e9714b17440ff0140ef9a4e7147993ad59672cbe6385274aa8b4f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:09:ea', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf33dc67e-31'}, 'message_id': '0cfe743e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5059.738434991, 'message_signature': '553d2b6452a3ea624b0e1eba771a8458f71052df44a2dd1f74dc7a93bc6a88e4'}]}, 'timestamp': '2025-11-22 07:58:37.145911', '_unique_id': 'c18e77c3e6d24d2f9671a9667397b2d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.146 12 ERROR oslo_messaging.notify.messaging 
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>]
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 02:58:37 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 07:58:37.147 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-697404939>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1175546546>]
Nov 22 02:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:37.329 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:37.330 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:37.331 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:37 np0005531887 podman[225534]: 2025-11-22 07:58:37.860485058 +0000 UTC m=+0.078464857 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350)
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.583 186853 INFO nova.compute.manager [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Unrescuing#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.584 186853 DEBUG oslo_concurrency.lockutils [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.584 186853 DEBUG oslo_concurrency.lockutils [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.584 186853 DEBUG nova.network.neutron [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.801 186853 DEBUG nova.compute.manager [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.802 186853 DEBUG oslo_concurrency.lockutils [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.802 186853 DEBUG oslo_concurrency.lockutils [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.802 186853 DEBUG oslo_concurrency.lockutils [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.803 186853 DEBUG nova.compute.manager [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:38 np0005531887 nova_compute[186849]: 2025-11-22 07:58:38.803 186853 WARNING nova.compute.manager [req-feb172a5-2fd8-4a22-922a-e4a2aecb6322 req-155fcd68-e13a-4849-8994-48cbbd11cb21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 02:58:39 np0005531887 nova_compute[186849]: 2025-11-22 07:58:39.893 186853 DEBUG nova.network.neutron [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updating instance_info_cache with network_info: [{"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:39 np0005531887 nova_compute[186849]: 2025-11-22 07:58:39.906 186853 DEBUG oslo_concurrency.lockutils [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-3f315996-e85d-463b-9123-272512335a7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:39 np0005531887 nova_compute[186849]: 2025-11-22 07:58:39.907 186853 DEBUG nova.objects.instance [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'flavor' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:39 np0005531887 kernel: tapf33dc67e-31 (unregistering): left promiscuous mode
Nov 22 02:58:39 np0005531887 NetworkManager[55210]: <info>  [1763798319.9829] device (tapf33dc67e-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:58:39 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:39Z|00233|binding|INFO|Releasing lport f33dc67e-3190-49f9-a981-9b80daf65bdb from this chassis (sb_readonly=0)
Nov 22 02:58:39 np0005531887 nova_compute[186849]: 2025-11-22 07:58:39.994 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:39 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:39Z|00234|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb down in Southbound
Nov 22 02:58:39 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:39Z|00235|binding|INFO|Removing iface tapf33dc67e-31 ovn-installed in OVS
Nov 22 02:58:39 np0005531887 nova_compute[186849]: 2025-11-22 07:58:39.997 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.007 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.008 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.010 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.015 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.027 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[962aa275-2cb2-4e32-8ac6-3158733e5a72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Deactivated successfully.
Nov 22 02:58:40 np0005531887 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000050.scope: Consumed 3.640s CPU time.
Nov 22 02:58:40 np0005531887 systemd-machined[153180]: Machine qemu-34-instance-00000050 terminated.
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.060 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c3430a-e923-4a77-8515-144e4d9421ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.064 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a5310ca1-6f95-46d6-b189-1d231e42c24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.095 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[816b2e9f-92e2-4f4c-9303-75ca1a9b886f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.112 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[376ae33d-56eb-4c11-9744-a5c4648ba4ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225567, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.128 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bf696fa9-2978-4b3d-80da-8b91a98538d6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225568, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225568, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.130 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.132 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.136 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.136 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.137 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.137 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.138 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.225 186853 INFO nova.virt.libvirt.driver [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance destroyed successfully.#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.226 186853 DEBUG nova.objects.instance [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:40 np0005531887 kernel: tapf33dc67e-31: entered promiscuous mode
Nov 22 02:58:40 np0005531887 NetworkManager[55210]: <info>  [1763798320.4217] manager: (tapf33dc67e-31): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Nov 22 02:58:40 np0005531887 systemd-udevd[225559]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:58:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:40Z|00236|binding|INFO|Claiming lport f33dc67e-3190-49f9-a981-9b80daf65bdb for this chassis.
Nov 22 02:58:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:40Z|00237|binding|INFO|f33dc67e-3190-49f9-a981-9b80daf65bdb: Claiming fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.422 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.432 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '7', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.433 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.435 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:58:40 np0005531887 NetworkManager[55210]: <info>  [1763798320.4366] device (tapf33dc67e-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:58:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:40Z|00238|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb ovn-installed in OVS
Nov 22 02:58:40 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:40Z|00239|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb up in Southbound
Nov 22 02:58:40 np0005531887 NetworkManager[55210]: <info>  [1763798320.4381] device (tapf33dc67e-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.441 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.451 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9c27362f-4b21-4c4d-a064-c1b8c3925407]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 systemd-machined[153180]: New machine qemu-35-instance-00000050.
Nov 22 02:58:40 np0005531887 systemd[1]: Started Virtual Machine qemu-35-instance-00000050.
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.483 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[34ca7467-94ab-42b6-9654-35f33d15d481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.488 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b93fdbad-fad8-402a-a854-3339eab96067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.517 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[82223290-df9d-4626-b318-3b8bff68a35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.536 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa72517-07c7-42fa-93ce-3e629aad37d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 13, 'rx_bytes': 658, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 13, 'rx_bytes': 658, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225617, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.554 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[313a4ff3-f176-4231-82cf-897a35751409]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225618, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225618, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.556 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.558 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.559 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.559 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.560 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:58:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:58:40.560 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.893 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.893 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.894 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.894 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.894 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.894 186853 WARNING nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.895 186853 WARNING nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.896 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.896 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.896 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.896 186853 DEBUG oslo_concurrency.lockutils [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.897 186853 DEBUG nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:40 np0005531887 nova_compute[186849]: 2025-11-22 07:58:40.897 186853 WARNING nova.compute.manager [req-b6dc0cfe-f541-40f0-910e-9fa3c2d812c4 req-eb5412f2-32b6-4961-a9a7-a1ec48ec3cc5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.032 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 3f315996-e85d-463b-9123-272512335a7f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.032 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798321.0316777, 3f315996-e85d-463b-9123-272512335a7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.032 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.037 186853 DEBUG nova.compute.manager [None req-01f6f0aa-76ef-48da-8f14-8aacde6119aa 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.057 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.061 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.087 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.087 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798321.036628, 3f315996-e85d-463b-9123-272512335a7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.087 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Started (Lifecycle Event)#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.115 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.123 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:41 np0005531887 nova_compute[186849]: 2025-11-22 07:58:41.587 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:41 np0005531887 podman[225628]: 2025-11-22 07:58:41.858210208 +0000 UTC m=+0.071814140 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 02:58:41 np0005531887 podman[225629]: 2025-11-22 07:58:41.884797001 +0000 UTC m=+0.098239839 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.986 186853 DEBUG nova.compute.manager [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.987 186853 DEBUG oslo_concurrency.lockutils [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.987 186853 DEBUG oslo_concurrency.lockutils [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.987 186853 DEBUG oslo_concurrency.lockutils [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.987 186853 DEBUG nova.compute.manager [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:58:42 np0005531887 nova_compute[186849]: 2025-11-22 07:58:42.987 186853 WARNING nova.compute.manager [req-2d858a0f-e55f-448b-a75e-cdffe52c9339 req-2d97996f-8cff-403e-9e57-ff11ae992ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state active and task_state None.#033[00m
Nov 22 02:58:44 np0005531887 nova_compute[186849]: 2025-11-22 07:58:44.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:44 np0005531887 nova_compute[186849]: 2025-11-22 07:58:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:44 np0005531887 nova_compute[186849]: 2025-11-22 07:58:44.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 02:58:44 np0005531887 nova_compute[186849]: 2025-11-22 07:58:44.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 02:58:46 np0005531887 nova_compute[186849]: 2025-11-22 07:58:46.501 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:46 np0005531887 nova_compute[186849]: 2025-11-22 07:58:46.589 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:47 np0005531887 podman[225674]: 2025-11-22 07:58:47.84055164 +0000 UTC m=+0.059141525 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 02:58:48 np0005531887 nova_compute[186849]: 2025-11-22 07:58:48.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.853 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.919 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.920 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.988 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:49 np0005531887 nova_compute[186849]: 2025-11-22 07:58:49.995 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.056 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.057 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.128 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.309 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.310 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5454MB free_disk=73.28828811645508GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.311 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.311 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.483 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance e4a6074c-55b0-4529-b184-3ba3ca0dab8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.484 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 3f315996-e85d-463b-9123-272512335a7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.484 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.484 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.604 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.619 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.646 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 02:58:50 np0005531887 nova_compute[186849]: 2025-11-22 07:58:50.647 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:58:51 np0005531887 nova_compute[186849]: 2025-11-22 07:58:51.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:51 np0005531887 nova_compute[186849]: 2025-11-22 07:58:51.592 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:52 np0005531887 nova_compute[186849]: 2025-11-22 07:58:52.648 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:52 np0005531887 nova_compute[186849]: 2025-11-22 07:58:52.648 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 02:58:52 np0005531887 nova_compute[186849]: 2025-11-22 07:58:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:53 np0005531887 podman[225710]: 2025-11-22 07:58:53.846665623 +0000 UTC m=+0.061657438 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 02:58:54 np0005531887 nova_compute[186849]: 2025-11-22 07:58:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:54 np0005531887 nova_compute[186849]: 2025-11-22 07:58:54.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 02:58:54 np0005531887 nova_compute[186849]: 2025-11-22 07:58:54.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 02:58:55 np0005531887 nova_compute[186849]: 2025-11-22 07:58:55.063 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:58:55 np0005531887 nova_compute[186849]: 2025-11-22 07:58:55.064 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:58:55 np0005531887 nova_compute[186849]: 2025-11-22 07:58:55.064 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 02:58:55 np0005531887 nova_compute[186849]: 2025-11-22 07:58:55.065 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:58:55 np0005531887 ovn_controller[95130]: 2025-11-22T07:58:55Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:09:ea 10.100.0.3
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.365 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [{"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.386 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-e4a6074c-55b0-4529-b184-3ba3ca0dab8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.386 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.387 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.387 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.508 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:56 np0005531887 nova_compute[186849]: 2025-11-22 07:58:56.595 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:58:58 np0005531887 podman[225739]: 2025-11-22 07:58:58.882342142 +0000 UTC m=+0.092789773 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:59:01 np0005531887 nova_compute[186849]: 2025-11-22 07:59:01.512 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:01 np0005531887 nova_compute[186849]: 2025-11-22 07:59:01.597 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:02 np0005531887 podman[225758]: 2025-11-22 07:59:02.850647129 +0000 UTC m=+0.063143914 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 02:59:06 np0005531887 nova_compute[186849]: 2025-11-22 07:59:06.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:06 np0005531887 nova_compute[186849]: 2025-11-22 07:59:06.600 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:06 np0005531887 nova_compute[186849]: 2025-11-22 07:59:06.780 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:06 np0005531887 nova_compute[186849]: 2025-11-22 07:59:06.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 02:59:08 np0005531887 podman[225785]: 2025-11-22 07:59:08.863642486 +0000 UTC m=+0.079600625 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, config_id=edpm)
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.409 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.410 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.432 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.568 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.569 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.577 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.577 186853 INFO nova.compute.claims [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.603 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.839 186853 DEBUG nova.compute.provider_tree [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.881 186853 DEBUG nova.scheduler.client.report [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.936 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:11 np0005531887 nova_compute[186849]: 2025-11-22 07:59:11.937 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.035 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.035 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.053 186853 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.069 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.175 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.177 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.177 186853 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Creating image(s)#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.178 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.178 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.178 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.193 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.264 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.265 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.266 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.278 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.340 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.341 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.383 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.385 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.386 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.452 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.454 186853 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Checking if we can resize image /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.455 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.479 186853 DEBUG nova.policy [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf1790780fd64791b117114d170d6d90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.519 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.521 186853 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Cannot resize image /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.522 186853 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'migration_context' on Instance uuid 63f687f3-efad-42ad-b771-c95586f36ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.546 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.546 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Ensure instance console log exists: /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.547 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.547 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:12 np0005531887 nova_compute[186849]: 2025-11-22 07:59:12.548 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:12 np0005531887 podman[225836]: 2025-11-22 07:59:12.861608363 +0000 UTC m=+0.074736944 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:59:12 np0005531887 podman[225837]: 2025-11-22 07:59:12.879072328 +0000 UTC m=+0.091943852 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 02:59:13 np0005531887 nova_compute[186849]: 2025-11-22 07:59:13.896 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Successfully created port: 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.220 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Successfully updated port: 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.234 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.235 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquired lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.235 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.319 186853 DEBUG nova.compute.manager [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-changed-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.319 186853 DEBUG nova.compute.manager [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Refreshing instance network info cache due to event network-changed-606d4bb0-7edc-4fdc-8e00-7dce6b636a37. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.320 186853 DEBUG oslo_concurrency.lockutils [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 02:59:15 np0005531887 nova_compute[186849]: 2025-11-22 07:59:15.420 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 02:59:16 np0005531887 nova_compute[186849]: 2025-11-22 07:59:16.519 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:16 np0005531887 nova_compute[186849]: 2025-11-22 07:59:16.605 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.578 186853 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Updating instance_info_cache with network_info: [{"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.599 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Releasing lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.599 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Instance network_info: |[{"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.600 186853 DEBUG oslo_concurrency.lockutils [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.600 186853 DEBUG nova.network.neutron [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Refreshing network info cache for port 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.604 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Start _get_guest_xml network_info=[{"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.608 186853 WARNING nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.613 186853 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.614 186853 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.617 186853 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.617 186853 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.619 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.619 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.619 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.620 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.620 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.620 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.620 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.621 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.621 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.621 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.621 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.622 186853 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.626 186853 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-2',id=87,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:12Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=63f687f3-efad-42ad-b771-c95586f36ed7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.626 186853 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.627 186853 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.628 186853 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63f687f3-efad-42ad-b771-c95586f36ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.640 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] End _get_guest_xml xml=<domain type="kvm">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <uuid>63f687f3-efad-42ad-b771-c95586f36ed7</uuid>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <name>instance-00000057</name>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1317504943-2</nova:name>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 07:59:17</nova:creationTime>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:user uuid="cf1790780fd64791b117114d170d6d90">tempest-ListServersNegativeTestJSON-1715955177-project-member</nova:user>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:project uuid="16ccb24424c54ae1a1b0d7eef6f7d690">tempest-ListServersNegativeTestJSON-1715955177</nova:project>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        <nova:port uuid="606d4bb0-7edc-4fdc-8e00-7dce6b636a37">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <system>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="serial">63f687f3-efad-42ad-b771-c95586f36ed7</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="uuid">63f687f3-efad-42ad-b771-c95586f36ed7</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </system>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <os>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </os>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <features>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </features>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </clock>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  <devices>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.config"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </disk>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:7b:76:96"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <target dev="tap606d4bb0-7e"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </interface>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/console.log" append="off"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </serial>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <video>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </video>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </rng>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 02:59:17 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 02:59:17 np0005531887 nova_compute[186849]:  </devices>
Nov 22 02:59:17 np0005531887 nova_compute[186849]: </domain>
Nov 22 02:59:17 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.641 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Preparing to wait for external event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.641 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.642 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.642 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.643 186853 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-2',id=87,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_
user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:12Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=63f687f3-efad-42ad-b771-c95586f36ed7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.643 186853 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.644 186853 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.644 186853 DEBUG os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.645 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.645 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.645 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.649 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.649 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap606d4bb0-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.650 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap606d4bb0-7e, col_values=(('external_ids', {'iface-id': '606d4bb0-7edc-4fdc-8e00-7dce6b636a37', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:76:96', 'vm-uuid': '63f687f3-efad-42ad-b771-c95586f36ed7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:17 np0005531887 NetworkManager[55210]: <info>  [1763798357.6553] manager: (tap606d4bb0-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.655 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.658 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.663 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.666 186853 INFO os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e')#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.750 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.750 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.751 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No VIF found with MAC fa:16:3e:7b:76:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 02:59:17 np0005531887 nova_compute[186849]: 2025-11-22 07:59:17.751 186853 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Using config drive#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.249 186853 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Creating config drive at /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.config#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.256 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzzp9jfz5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.388 186853 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzzp9jfz5" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:18 np0005531887 kernel: tap606d4bb0-7e: entered promiscuous mode
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.4721] manager: (tap606d4bb0-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.475 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:18Z|00240|binding|INFO|Claiming lport 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 for this chassis.
Nov 22 02:59:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:18Z|00241|binding|INFO|606d4bb0-7edc-4fdc-8e00-7dce6b636a37: Claiming fa:16:3e:7b:76:96 10.100.0.8
Nov 22 02:59:18 np0005531887 systemd-udevd[225914]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.513 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:76:96 10.100.0.8'], port_security=['fa:16:3e:7b:76:96 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '63f687f3-efad-42ad-b771-c95586f36ed7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=606d4bb0-7edc-4fdc-8e00-7dce6b636a37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.515 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 bound to our chassis#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.517 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6148823-d007-4a7e-be44-4329f8ecc6e5#033[00m
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.5312] device (tap606d4bb0-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.5320] device (tap606d4bb0-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.532 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0930dbcd-e048-43e1-a930-df62e6404ec6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.536 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6148823-d1 in ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.538 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6148823-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 02:59:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:18Z|00242|binding|INFO|Setting lport 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 ovn-installed in OVS
Nov 22 02:59:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:18Z|00243|binding|INFO|Setting lport 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 up in Southbound
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.538 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[49c55c94-1755-49f0-866f-6df6406e67b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.543 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3cf230-eb3b-4191-8b41-64604da5be3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 systemd-machined[153180]: New machine qemu-36-instance-00000057.
Nov 22 02:59:18 np0005531887 podman[225893]: 2025-11-22 07:59:18.548468171 +0000 UTC m=+0.080519098 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 02:59:18 np0005531887 systemd[1]: Started Virtual Machine qemu-36-instance-00000057.
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.561 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce436c7-296c-46b5-a337-67a309f16ede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.577 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[89e21277-95fd-4c8d-bbf0-269783f683be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.613 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c863b4b4-ea56-4ecd-b096-95610851ba62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.619 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0571fe-d72d-47c1-b566-c5b091ca41e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.6208] manager: (tapd6148823-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.655 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5619f3d7-6ad2-4944-a172-c9151319877c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.658 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d9be1170-cbc1-4d0a-92dd-27a72e86994f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.6881] device (tapd6148823-d0): carrier: link connected
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.694 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a94882dd-e49f-4146-8a7d-0958e4c7e52c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.713 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[37b314ac-cd02-43e8-9132-9bd7cbbf9c08]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510129, 'reachable_time': 16273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225958, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.734 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d6920e-ef55-4964-9233-22d3d7122371]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:f2ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510129, 'tstamp': 510129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225959, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.754 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5af0d75d-b39b-4c77-9301-a6d8839d79fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510129, 'reachable_time': 16273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225960, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.789 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c2d216-01f9-4bc4-a605-f0a09090198f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.866 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4b28cc90-8550-42cc-a9c3-92e30d07f84f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.868 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.868 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.869 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6148823-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531887 kernel: tapd6148823-d0: entered promiscuous mode
Nov 22 02:59:18 np0005531887 NetworkManager[55210]: <info>  [1763798358.8719] manager: (tapd6148823-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.873 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.873 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6148823-d0, col_values=(('external_ids', {'iface-id': '2f86d506-522f-4def-915e-a14693535092'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:18 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:18Z|00244|binding|INFO|Releasing lport 2f86d506-522f-4def-915e-a14693535092 from this chassis (sb_readonly=0)
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.876 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.877 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.878 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3c055d95-5f74-4d0e-8562-7878011cda9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.879 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 02:59:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:18.880 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'env', 'PROCESS_TAG=haproxy-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6148823-d007-4a7e-be44-4329f8ecc6e5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 02:59:18 np0005531887 nova_compute[186849]: 2025-11-22 07:59:18.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:19 np0005531887 podman[225992]: 2025-11-22 07:59:19.372138998 +0000 UTC m=+0.098836865 container create 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:59:19 np0005531887 podman[225992]: 2025-11-22 07:59:19.302098092 +0000 UTC m=+0.028795989 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 02:59:19 np0005531887 systemd[1]: Started libpod-conmon-16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531.scope.
Nov 22 02:59:19 np0005531887 systemd[1]: Started libcrun container.
Nov 22 02:59:19 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f1cb55a0e7f1aa57884f24549106c8f08354c03daf2785f0d2264bb8c69d8eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 02:59:19 np0005531887 podman[225992]: 2025-11-22 07:59:19.491658386 +0000 UTC m=+0.218356283 container init 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:59:19 np0005531887 podman[225992]: 2025-11-22 07:59:19.498108567 +0000 UTC m=+0.224806424 container start 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:59:19 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [NOTICE]   (226012) : New worker (226014) forked
Nov 22 02:59:19 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [NOTICE]   (226012) : Loading success.
Nov 22 02:59:19 np0005531887 nova_compute[186849]: 2025-11-22 07:59:19.960 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798359.9591794, 63f687f3-efad-42ad-b771-c95586f36ed7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:19 np0005531887 nova_compute[186849]: 2025-11-22 07:59:19.961 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] VM Started (Lifecycle Event)#033[00m
Nov 22 02:59:19 np0005531887 nova_compute[186849]: 2025-11-22 07:59:19.980 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:19 np0005531887 nova_compute[186849]: 2025-11-22 07:59:19.986 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798359.9598742, 63f687f3-efad-42ad-b771-c95586f36ed7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:19 np0005531887 nova_compute[186849]: 2025-11-22 07:59:19.986 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] VM Paused (Lifecycle Event)#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.004 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.008 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.024 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.570 186853 DEBUG nova.network.neutron [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Updated VIF entry in instance network info cache for port 606d4bb0-7edc-4fdc-8e00-7dce6b636a37. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.572 186853 DEBUG nova.network.neutron [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Updating instance_info_cache with network_info: [{"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:20 np0005531887 nova_compute[186849]: 2025-11-22 07:59:20.590 186853 DEBUG oslo_concurrency.lockutils [req-3b091675-2f7f-4eed-a6a6-85e857606911 req-9ea4a87a-414b-4632-b29b-37d526e3e233 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-63f687f3-efad-42ad-b771-c95586f36ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 02:59:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:21.259 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:21.260 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 02:59:21 np0005531887 nova_compute[186849]: 2025-11-22 07:59:21.261 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:21 np0005531887 nova_compute[186849]: 2025-11-22 07:59:21.607 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.654 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.730 186853 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.730 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.730 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.731 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.731 186853 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Processing event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.731 186853 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.731 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.732 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.732 186853 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.732 186853 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] No waiting events found dispatching network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.732 186853 WARNING nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received unexpected event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.733 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.736 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798362.7366016, 63f687f3-efad-42ad-b771-c95586f36ed7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.737 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] VM Resumed (Lifecycle Event)#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.738 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.744 186853 INFO nova.virt.libvirt.driver [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Instance spawned successfully.#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.744 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.758 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.765 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.770 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.770 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.771 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.771 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.772 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.772 186853 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.816 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.859 186853 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Took 10.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.859 186853 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.937 186853 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Took 11.45 seconds to build instance.#033[00m
Nov 22 02:59:22 np0005531887 nova_compute[186849]: 2025-11-22 07:59:22.958 186853 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:24.261 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:24 np0005531887 podman[226033]: 2025-11-22 07:59:24.669871848 +0000 UTC m=+0.057924325 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 02:59:26 np0005531887 nova_compute[186849]: 2025-11-22 07:59:26.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:27 np0005531887 nova_compute[186849]: 2025-11-22 07:59:27.657 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:29 np0005531887 podman[226052]: 2025-11-22 07:59:29.867368629 +0000 UTC m=+0.075841521 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.419 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.422 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.422 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.423 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.423 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.434 186853 INFO nova.compute.manager [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Terminating instance#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.442 186853 DEBUG nova.compute.manager [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:59:31 np0005531887 kernel: tap606d4bb0-7e (unregistering): left promiscuous mode
Nov 22 02:59:31 np0005531887 NetworkManager[55210]: <info>  [1763798371.4745] device (tap606d4bb0-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:59:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:31Z|00245|binding|INFO|Releasing lport 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 from this chassis (sb_readonly=0)
Nov 22 02:59:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:31Z|00246|binding|INFO|Setting lport 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 down in Southbound
Nov 22 02:59:31 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:31Z|00247|binding|INFO|Removing iface tap606d4bb0-7e ovn-installed in OVS
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.487 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.502 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:76:96 10.100.0.8'], port_security=['fa:16:3e:7b:76:96 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '63f687f3-efad-42ad-b771-c95586f36ed7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=606d4bb0-7edc-4fdc-8e00-7dce6b636a37) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.503 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 606d4bb0-7edc-4fdc-8e00-7dce6b636a37 in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 unbound from our chassis#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.505 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6148823-d007-4a7e-be44-4329f8ecc6e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.507 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[263d8c87-f7bb-438d-b4cb-9ad9e1c3bb6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.509 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace which is not needed anymore#033[00m
Nov 22 02:59:31 np0005531887 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Deactivated successfully.
Nov 22 02:59:31 np0005531887 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000057.scope: Consumed 10.240s CPU time.
Nov 22 02:59:31 np0005531887 systemd-machined[153180]: Machine qemu-36-instance-00000057 terminated.
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.614 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [NOTICE]   (226012) : haproxy version is 2.8.14-c23fe91
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [NOTICE]   (226012) : path to executable is /usr/sbin/haproxy
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [WARNING]  (226012) : Exiting Master process...
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [WARNING]  (226012) : Exiting Master process...
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [ALERT]    (226012) : Current worker (226014) exited with code 143 (Terminated)
Nov 22 02:59:31 np0005531887 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[226008]: [WARNING]  (226012) : All workers exited. Exiting... (0)
Nov 22 02:59:31 np0005531887 systemd[1]: libpod-16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531.scope: Deactivated successfully.
Nov 22 02:59:31 np0005531887 podman[226097]: 2025-11-22 07:59:31.694474584 +0000 UTC m=+0.078946589 container died 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.735 186853 INFO nova.virt.libvirt.driver [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Instance destroyed successfully.#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.737 186853 DEBUG nova.objects.instance [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'resources' on Instance uuid 63f687f3-efad-42ad-b771-c95586f36ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.748 186853 DEBUG nova.virt.libvirt.vif [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-2',id=87,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-22T07:59:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:22Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=63f687f3-efad-42ad-b771-c95586f36ed7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.749 186853 DEBUG nova.network.os_vif_util [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "address": "fa:16:3e:7b:76:96", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap606d4bb0-7e", "ovs_interfaceid": "606d4bb0-7edc-4fdc-8e00-7dce6b636a37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.749 186853 DEBUG nova.network.os_vif_util [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.750 186853 DEBUG os_vif [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.754 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap606d4bb0-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.756 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.758 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531-userdata-shm.mount: Deactivated successfully.
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.762 186853 INFO os_vif [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:76:96,bridge_name='br-int',has_traffic_filtering=True,id=606d4bb0-7edc-4fdc-8e00-7dce6b636a37,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap606d4bb0-7e')#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.763 186853 INFO nova.virt.libvirt.driver [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Deleting instance files /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7_del#033[00m
Nov 22 02:59:31 np0005531887 systemd[1]: var-lib-containers-storage-overlay-2f1cb55a0e7f1aa57884f24549106c8f08354c03daf2785f0d2264bb8c69d8eb-merged.mount: Deactivated successfully.
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.764 186853 INFO nova.virt.libvirt.driver [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Deletion of /var/lib/nova/instances/63f687f3-efad-42ad-b771-c95586f36ed7_del complete#033[00m
Nov 22 02:59:31 np0005531887 podman[226097]: 2025-11-22 07:59:31.772640803 +0000 UTC m=+0.157112778 container cleanup 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 02:59:31 np0005531887 systemd[1]: libpod-conmon-16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531.scope: Deactivated successfully.
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.843 186853 INFO nova.compute.manager [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.844 186853 DEBUG oslo.service.loopingcall [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.844 186853 DEBUG nova.compute.manager [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.844 186853 DEBUG nova.network.neutron [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:59:31 np0005531887 podman[226146]: 2025-11-22 07:59:31.866188513 +0000 UTC m=+0.070785584 container remove 16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.871 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1d119431-3471-4707-826c-df0d61e42cf1]: (4, ('Sat Nov 22 07:59:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531)\n16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531\nSat Nov 22 07:59:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531)\n16a0d8dea394b60090df84e1bbe812420f433dcfc27332b266cc5e6e8edbc531\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.872 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[08633ed8-d077-4b29-b1a0-ffea2c5dd319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.873 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.875 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 kernel: tapd6148823-d0: left promiscuous mode
Nov 22 02:59:31 np0005531887 nova_compute[186849]: 2025-11-22 07:59:31.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.891 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8891db-f618-4d7f-87b1-27b0b3211b58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.911 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3570f6d1-1695-4b4f-bd4e-f93720312433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.916 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac44459-3396-4c35-a7ad-9c51125e15f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.938 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f49560d2-4c83-45a9-923f-92e82e5850ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510121, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226162, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.943 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:59:31 np0005531887 systemd[1]: run-netns-ovnmeta\x2dd6148823\x2dd007\x2d4a7e\x2dbe44\x2d4329f8ecc6e5.mount: Deactivated successfully.
Nov 22 02:59:31 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:31.944 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[166af427-fe05-4860-8ae3-dd82c1c52c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.021 186853 DEBUG nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-unplugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.021 186853 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.022 186853 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.022 186853 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.022 186853 DEBUG nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] No waiting events found dispatching network-vif-unplugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.022 186853 DEBUG nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-unplugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.473 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.474 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.475 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.475 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.475 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.483 186853 INFO nova.compute.manager [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Terminating instance#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.489 186853 DEBUG nova.compute.manager [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:59:32 np0005531887 kernel: tapf33dc67e-31 (unregistering): left promiscuous mode
Nov 22 02:59:32 np0005531887 NetworkManager[55210]: <info>  [1763798372.5274] device (tapf33dc67e-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:59:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:32Z|00248|binding|INFO|Releasing lport f33dc67e-3190-49f9-a981-9b80daf65bdb from this chassis (sb_readonly=0)
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:32Z|00249|binding|INFO|Setting lport f33dc67e-3190-49f9-a981-9b80daf65bdb down in Southbound
Nov 22 02:59:32 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:32Z|00250|binding|INFO|Removing iface tapf33dc67e-31 ovn-installed in OVS
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.537 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.546 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:09:ea 10.100.0.3'], port_security=['fa:16:3e:bc:09:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3f315996-e85d-463b-9123-272512335a7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '8', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=f33dc67e-3190-49f9-a981-9b80daf65bdb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.549 104084 INFO neutron.agent.ovn.metadata.agent [-] Port f33dc67e-3190-49f9-a981-9b80daf65bdb in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.551 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.571 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[80e22c5c-5c3e-49f1-9c11-f3562d5384b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000050.scope: Deactivated successfully.
Nov 22 02:59:32 np0005531887 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000050.scope: Consumed 16.696s CPU time.
Nov 22 02:59:32 np0005531887 systemd-machined[153180]: Machine qemu-35-instance-00000050 terminated.
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.610 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a19a639a-1a8a-4f22-afed-06b4c2537b2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.615 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[55e7cf4d-7b3e-4662-b5fd-1d6eb8c7728e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.652 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a5f5a1-cdf4-41a2-b0f2-d9fb414db8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.672 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a93dba05-5332-4ad0-b159-59ba7209afa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 15, 'rx_bytes': 742, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 15, 'rx_bytes': 742, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501577, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226171, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.692 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[793307ea-062f-4b42-b94b-eb7a6077daa4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501591, 'tstamp': 501591}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226172, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap06e0f3a5-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501595, 'tstamp': 501595}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226172, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.694 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.696 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.703 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.704 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.705 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.705 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:32.706 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.767 186853 INFO nova.virt.libvirt.driver [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Instance destroyed successfully.#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.768 186853 DEBUG nova.objects.instance [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid 3f315996-e85d-463b-9123-272512335a7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.780 186853 DEBUG nova.virt.libvirt.vif [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1175546546',display_name='tempest-ServerStableDeviceRescueTest-server-1175546546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1175546546',id=80,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-wu2qhkzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:58:41Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=3f315996-e85d-463b-9123-272512335a7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.780 186853 DEBUG nova.network.os_vif_util [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "address": "fa:16:3e:bc:09:ea", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf33dc67e-31", "ovs_interfaceid": "f33dc67e-3190-49f9-a981-9b80daf65bdb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.781 186853 DEBUG nova.network.os_vif_util [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.781 186853 DEBUG os_vif [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.782 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.783 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf33dc67e-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.784 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.786 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.789 186853 INFO os_vif [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:09:ea,bridge_name='br-int',has_traffic_filtering=True,id=f33dc67e-3190-49f9-a981-9b80daf65bdb,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf33dc67e-31')#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.789 186853 INFO nova.virt.libvirt.driver [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Deleting instance files /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f_del#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.790 186853 INFO nova.virt.libvirt.driver [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Deletion of /var/lib/nova/instances/3f315996-e85d-463b-9123-272512335a7f_del complete#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.794 186853 DEBUG nova.network.neutron [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.816 186853 DEBUG nova.compute.manager [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.816 186853 DEBUG oslo_concurrency.lockutils [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.816 186853 DEBUG oslo_concurrency.lockutils [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.817 186853 DEBUG oslo_concurrency.lockutils [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.817 186853 DEBUG nova.compute.manager [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.817 186853 DEBUG nova.compute.manager [req-ea7aaf38-6cc2-4b8a-ac91-83780dbe6c6a req-51ea06ee-2e24-4a06-8339-4068c0e65bbe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-unplugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.836 186853 INFO nova.compute.manager [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Took 0.99 seconds to deallocate network for instance.#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.908 186853 INFO nova.compute.manager [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.910 186853 DEBUG oslo.service.loopingcall [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.910 186853 DEBUG nova.compute.manager [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.910 186853 DEBUG nova.network.neutron [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.963 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:32 np0005531887 nova_compute[186849]: 2025-11-22 07:59:32.963 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.065 186853 DEBUG nova.compute.provider_tree [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.078 186853 DEBUG nova.scheduler.client.report [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.097 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.165 186853 INFO nova.scheduler.client.report [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Deleted allocations for instance 63f687f3-efad-42ad-b771-c95586f36ed7#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.264 186853 DEBUG oslo_concurrency.lockutils [None req-144df5e8-6e4c-45ac-bc3a-fdf46220df0e cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.756 186853 DEBUG nova.network.neutron [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.789 186853 INFO nova.compute.manager [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] Took 0.88 seconds to deallocate network for instance.#033[00m
Nov 22 02:59:33 np0005531887 podman[226191]: 2025-11-22 07:59:33.842328173 +0000 UTC m=+0.060196381 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.871 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.872 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.974 186853 DEBUG nova.compute.provider_tree [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 02:59:33 np0005531887 nova_compute[186849]: 2025-11-22 07:59:33.989 186853 DEBUG nova.scheduler.client.report [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.020 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.106 186853 DEBUG nova.compute.manager [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-deleted-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.106 186853 DEBUG nova.compute.manager [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-deleted-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.134 186853 DEBUG nova.compute.manager [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.134 186853 DEBUG oslo_concurrency.lockutils [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.134 186853 DEBUG oslo_concurrency.lockutils [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.134 186853 DEBUG oslo_concurrency.lockutils [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63f687f3-efad-42ad-b771-c95586f36ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.135 186853 DEBUG nova.compute.manager [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] No waiting events found dispatching network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.135 186853 WARNING nova.compute.manager [req-afcd9ab6-e7a4-49b1-87a3-f5f4a07f8c83 req-83529df1-858f-4712-8c0e-b12b6101c202 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Received unexpected event network-vif-plugged-606d4bb0-7edc-4fdc-8e00-7dce6b636a37 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.155 186853 INFO nova.scheduler.client.report [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Deleted allocations for instance 3f315996-e85d-463b-9123-272512335a7f#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.593 186853 DEBUG oslo_concurrency.lockutils [None req-3d9ab49f-d365-4e9a-b119-ef6b852cbc02 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.917 186853 DEBUG nova.compute.manager [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.918 186853 DEBUG oslo_concurrency.lockutils [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f315996-e85d-463b-9123-272512335a7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.918 186853 DEBUG oslo_concurrency.lockutils [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.919 186853 DEBUG oslo_concurrency.lockutils [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f315996-e85d-463b-9123-272512335a7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.919 186853 DEBUG nova.compute.manager [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] No waiting events found dispatching network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:34 np0005531887 nova_compute[186849]: 2025-11-22 07:59:34.919 186853 WARNING nova.compute.manager [req-97ebc68e-3044-4665-a5e6-161e64e411ca req-519793b8-a88f-483f-8e74-92272c3b852a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f315996-e85d-463b-9123-272512335a7f] Received unexpected event network-vif-plugged-f33dc67e-3190-49f9-a981-9b80daf65bdb for instance with vm_state deleted and task_state None.#033[00m
Nov 22 02:59:36 np0005531887 nova_compute[186849]: 2025-11-22 07:59:36.617 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.330 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.331 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.331 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.540 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.540 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.541 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.541 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.541 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.549 186853 INFO nova.compute.manager [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Terminating instance#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.558 186853 DEBUG nova.compute.manager [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 02:59:37 np0005531887 kernel: tap392e43af-a9 (unregistering): left promiscuous mode
Nov 22 02:59:37 np0005531887 NetworkManager[55210]: <info>  [1763798377.5907] device (tap392e43af-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 02:59:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:37Z|00251|binding|INFO|Releasing lport 392e43af-a923-4bd6-bdff-445c6101995b from this chassis (sb_readonly=0)
Nov 22 02:59:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:37Z|00252|binding|INFO|Setting lport 392e43af-a923-4bd6-bdff-445c6101995b down in Southbound
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.602 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:37Z|00253|binding|INFO|Removing iface tap392e43af-a9 ovn-installed in OVS
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.620 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Nov 22 02:59:37 np0005531887 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Consumed 20.577s CPU time.
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.639 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:9b:64 10.100.0.7'], port_security=['fa:16:3e:10:9b:64 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e4a6074c-55b0-4529-b184-3ba3ca0dab8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '8', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=392e43af-a923-4bd6-bdff-445c6101995b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.641 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 392e43af-a923-4bd6-bdff-445c6101995b in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis#033[00m
Nov 22 02:59:37 np0005531887 systemd-machined[153180]: Machine qemu-32-instance-0000004d terminated.
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.642 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.643 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bd168f7e-5497-4295-8fd9-08c707be87de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:37.644 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.785 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [NOTICE]   (225106) : haproxy version is 2.8.14-c23fe91
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [NOTICE]   (225106) : path to executable is /usr/sbin/haproxy
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [WARNING]  (225106) : Exiting Master process...
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [WARNING]  (225106) : Exiting Master process...
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [ALERT]    (225106) : Current worker (225108) exited with code 143 (Terminated)
Nov 22 02:59:37 np0005531887 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[225102]: [WARNING]  (225106) : All workers exited. Exiting... (0)
Nov 22 02:59:37 np0005531887 systemd[1]: libpod-a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc.scope: Deactivated successfully.
Nov 22 02:59:37 np0005531887 podman[226241]: 2025-11-22 07:59:37.826498016 +0000 UTC m=+0.074392785 container died a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 02:59:37 np0005531887 ovn_controller[95130]: 2025-11-22T07:59:37Z|00254|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.845 186853 INFO nova.virt.libvirt.driver [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Instance destroyed successfully.#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.849 186853 DEBUG nova.objects.instance [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid e4a6074c-55b0-4529-b184-3ba3ca0dab8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.877 186853 DEBUG nova.virt.libvirt.vif [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-697404939',display_name='tempest-ServerStableDeviceRescueTest-server-697404939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-697404939',id=77,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-0m39mt0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:53Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=e4a6074c-55b0-4529-b184-3ba3ca0dab8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.878 186853 DEBUG nova.network.os_vif_util [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "392e43af-a923-4bd6-bdff-445c6101995b", "address": "fa:16:3e:10:9b:64", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap392e43af-a9", "ovs_interfaceid": "392e43af-a923-4bd6-bdff-445c6101995b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.879 186853 DEBUG nova.network.os_vif_util [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.879 186853 DEBUG os_vif [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.882 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.882 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap392e43af-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 02:59:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc-userdata-shm.mount: Deactivated successfully.
Nov 22 02:59:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay-50e51983c6bef0bbe7854780a42ba219c17700dfded49967db45379275f6f35c-merged.mount: Deactivated successfully.
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.909 186853 DEBUG nova.compute.manager [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.910 186853 DEBUG oslo_concurrency.lockutils [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.910 186853 DEBUG oslo_concurrency.lockutils [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.910 186853 DEBUG oslo_concurrency.lockutils [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.911 186853 DEBUG nova.compute.manager [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.911 186853 DEBUG nova.compute.manager [req-9ac857ef-73d5-4c77-b2bb-340b2886172f req-ccb1e386-bc1e-48ee-a407-94a536235632 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-unplugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.911 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.914 186853 INFO os_vif [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:9b:64,bridge_name='br-int',has_traffic_filtering=True,id=392e43af-a923-4bd6-bdff-445c6101995b,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap392e43af-a9')#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.915 186853 INFO nova.virt.libvirt.driver [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Deleting instance files /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c_del#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.916 186853 INFO nova.virt.libvirt.driver [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Deletion of /var/lib/nova/instances/e4a6074c-55b0-4529-b184-3ba3ca0dab8c_del complete#033[00m
Nov 22 02:59:37 np0005531887 nova_compute[186849]: 2025-11-22 07:59:37.919 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:37 np0005531887 podman[226241]: 2025-11-22 07:59:37.961700116 +0000 UTC m=+0.209594885 container cleanup a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 02:59:37 np0005531887 systemd[1]: libpod-conmon-a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc.scope: Deactivated successfully.
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.000 186853 INFO nova.compute.manager [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.000 186853 DEBUG oslo.service.loopingcall [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.001 186853 DEBUG nova.compute.manager [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.002 186853 DEBUG nova.network.neutron [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 02:59:38 np0005531887 podman[226285]: 2025-11-22 07:59:38.070384765 +0000 UTC m=+0.086728813 container remove a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.075 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1777b0fe-e055-471a-85b7-2659c3dafca3]: (4, ('Sat Nov 22 07:59:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc)\na7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc\nSat Nov 22 07:59:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (a7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc)\na7bc02ff977de32f78b5c5b2e408cda232607b926eeaa57ffbdf06c7483d9ccc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.077 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1952fe-17ae-486b-a2d2-95a154e255cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.078 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.079 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:38 np0005531887 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 02:59:38 np0005531887 nova_compute[186849]: 2025-11-22 07:59:38.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.096 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fccd3211-e0e9-4bc4-9d0d-079edc2e408d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.122 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[824290af-b0fa-4a9b-9bb2-2f97119a6761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.124 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[71b28992-507c-457c-bc72-f85849bd14ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.140 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[590d7954-45de-4586-a95a-518bb2afe46c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501568, 'reachable_time': 22535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226298, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.142 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 02:59:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 07:59:38.143 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c45e04-d7fe-4e9e-bdb2-50508a3b3973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 02:59:38 np0005531887 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 02:59:39 np0005531887 podman[226299]: 2025-11-22 07:59:39.837075233 +0000 UTC m=+0.060392226 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.000 186853 DEBUG nova.compute.manager [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.000 186853 DEBUG oslo_concurrency.lockutils [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.001 186853 DEBUG oslo_concurrency.lockutils [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.001 186853 DEBUG oslo_concurrency.lockutils [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.001 186853 DEBUG nova.compute.manager [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] No waiting events found dispatching network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.001 186853 WARNING nova.compute.manager [req-712c7651-81c8-420d-af24-c611a3341c85 req-9ad3b492-efd1-45e3-bc23-7f6cf0632c81 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received unexpected event network-vif-plugged-392e43af-a923-4bd6-bdff-445c6101995b for instance with vm_state active and task_state deleting.
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.407 186853 DEBUG nova.network.neutron [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.456 186853 INFO nova.compute.manager [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Took 2.45 seconds to deallocate network for instance.
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.595 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.596 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.658 186853 DEBUG nova.compute.provider_tree [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.669 186853 DEBUG nova.scheduler.client.report [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.734 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.793 186853 INFO nova.scheduler.client.report [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Deleted allocations for instance e4a6074c-55b0-4529-b184-3ba3ca0dab8c
Nov 22 02:59:40 np0005531887 nova_compute[186849]: 2025-11-22 07:59:40.866 186853 DEBUG oslo_concurrency.lockutils [None req-7a770eb1-9cf3-4881-aec4-99c28c042ca7 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "e4a6074c-55b0-4529-b184-3ba3ca0dab8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:41 np0005531887 nova_compute[186849]: 2025-11-22 07:59:41.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:42 np0005531887 nova_compute[186849]: 2025-11-22 07:59:42.143 186853 DEBUG nova.compute.manager [req-deeac83e-a5e7-44e0-8614-fbf015048313 req-6b7da09c-b1a2-4882-8ba4-a5915b896c0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Received event network-vif-deleted-392e43af-a923-4bd6-bdff-445c6101995b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 02:59:42 np0005531887 nova_compute[186849]: 2025-11-22 07:59:42.885 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:43 np0005531887 podman[226322]: 2025-11-22 07:59:43.837799339 +0000 UTC m=+0.061669368 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 02:59:43 np0005531887 podman[226323]: 2025-11-22 07:59:43.873432127 +0000 UTC m=+0.091075381 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 02:59:45 np0005531887 nova_compute[186849]: 2025-11-22 07:59:45.914 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:46 np0005531887 nova_compute[186849]: 2025-11-22 07:59:46.621 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:46 np0005531887 nova_compute[186849]: 2025-11-22 07:59:46.732 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798371.7311978, 63f687f3-efad-42ad-b771-c95586f36ed7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:59:46 np0005531887 nova_compute[186849]: 2025-11-22 07:59:46.733 186853 INFO nova.compute.manager [-] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] VM Stopped (Lifecycle Event)
Nov 22 02:59:46 np0005531887 nova_compute[186849]: 2025-11-22 07:59:46.750 186853 DEBUG nova.compute.manager [None req-ab42023a-b4e9-477d-8ddd-ecf61605bdc8 - - - - - -] [instance: 63f687f3-efad-42ad-b771-c95586f36ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:59:47 np0005531887 nova_compute[186849]: 2025-11-22 07:59:47.765 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798372.7643616, 3f315996-e85d-463b-9123-272512335a7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:59:47 np0005531887 nova_compute[186849]: 2025-11-22 07:59:47.766 186853 INFO nova.compute.manager [-] [instance: 3f315996-e85d-463b-9123-272512335a7f] VM Stopped (Lifecycle Event)
Nov 22 02:59:47 np0005531887 nova_compute[186849]: 2025-11-22 07:59:47.794 186853 DEBUG nova.compute.manager [None req-41df4850-bbfc-44cd-b961-918a17cb37a6 - - - - - -] [instance: 3f315996-e85d-463b-9123-272512335a7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:59:47 np0005531887 nova_compute[186849]: 2025-11-22 07:59:47.887 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:48 np0005531887 podman[226369]: 2025-11-22 07:59:48.840854472 +0000 UTC m=+0.057408034 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 02:59:49 np0005531887 nova_compute[186849]: 2025-11-22 07:59:49.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.621 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.790 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.993 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.994 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5719MB free_disk=73.34574127197266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.994 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:51 np0005531887 nova_compute[186849]: 2025-11-22 07:59:51.995 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.061 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.062 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.081 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.093 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.344 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.344 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.844 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798377.8424802, e4a6074c-55b0-4529-b184-3ba3ca0dab8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.844 186853 INFO nova.compute.manager [-] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] VM Stopped (Lifecycle Event)
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.859 186853 DEBUG nova.compute.manager [None req-305de3f5-5c17-4866-b234-ad89c7d6f6f8 - - - - - -] [instance: e4a6074c-55b0-4529-b184-3ba3ca0dab8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 02:59:52 np0005531887 nova_compute[186849]: 2025-11-22 07:59:52.890 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:54 np0005531887 nova_compute[186849]: 2025-11-22 07:59:54.344 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:54 np0005531887 podman[226395]: 2025-11-22 07:59:54.841966925 +0000 UTC m=+0.059916729 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 02:59:55 np0005531887 nova_compute[186849]: 2025-11-22 07:59:55.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:56 np0005531887 nova_compute[186849]: 2025-11-22 07:59:56.625 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:56 np0005531887 nova_compute[186849]: 2025-11-22 07:59:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 02:59:56 np0005531887 nova_compute[186849]: 2025-11-22 07:59:56.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 02:59:56 np0005531887 nova_compute[186849]: 2025-11-22 07:59:56.794 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.853 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.853 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.864 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.892 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.984 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.985 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.992 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 02:59:57 np0005531887 nova_compute[186849]: 2025-11-22 07:59:57.993 186853 INFO nova.compute.claims [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Claim successful on node compute-1.ctlplane.example.com
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.103 186853 DEBUG nova.compute.provider_tree [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.122 186853 DEBUG nova.scheduler.client.report [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.165 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.166 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.222 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.222 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.240 186853 INFO nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.267 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.396 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.398 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.398 186853 INFO nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Creating image(s)
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.399 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.399 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.400 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.418 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.467 186853 DEBUG nova.policy [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.479 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.480 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.480 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.495 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.557 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.558 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.788 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.849 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk 1073741824" returned: 0 in 0.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.850 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.850 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.941 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.942 186853 DEBUG nova.virt.disk.api [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Checking if we can resize image /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 02:59:58 np0005531887 nova_compute[186849]: 2025-11-22 07:59:58.943 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.015 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.016 186853 DEBUG nova.virt.disk.api [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Cannot resize image /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.016 186853 DEBUG nova.objects.instance [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'migration_context' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.043 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.044 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Ensure instance console log exists: /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.044 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.045 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 02:59:59 np0005531887 nova_compute[186849]: 2025-11-22 07:59:59.045 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:00 np0005531887 nova_compute[186849]: 2025-11-22 08:00:00.796 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Successfully created port: 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:00:00 np0005531887 podman[226430]: 2025-11-22 08:00:00.839809663 +0000 UTC m=+0.057204018 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:00:01 np0005531887 nova_compute[186849]: 2025-11-22 08:00:01.627 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.613 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Successfully updated port: 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.626 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.626 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.627 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.705 186853 DEBUG nova.compute.manager [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-changed-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.705 186853 DEBUG nova.compute.manager [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing instance network info cache due to event network-changed-44ab3743-f6b1-4f3e-9686-53c9ebd45d37. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.706 186853 DEBUG oslo_concurrency.lockutils [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.781 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:00:02 np0005531887 nova_compute[186849]: 2025-11-22 08:00:02.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.274 186853 DEBUG nova.network.neutron [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.292 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.292 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Instance network_info: |[{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.293 186853 DEBUG oslo_concurrency.lockutils [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.293 186853 DEBUG nova.network.neutron [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing network info cache for port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.296 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Start _get_guest_xml network_info=[{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.302 186853 WARNING nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.310 186853 DEBUG nova.virt.libvirt.host [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.311 186853 DEBUG nova.virt.libvirt.host [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.316 186853 DEBUG nova.virt.libvirt.host [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.317 186853 DEBUG nova.virt.libvirt.host [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.319 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.319 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.320 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.321 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.321 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.322 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.323 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.323 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.324 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.324 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.325 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.325 186853 DEBUG nova.virt.hardware [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.331 186853 DEBUG nova.virt.libvirt.vif [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.332 186853 DEBUG nova.network.os_vif_util [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.333 186853 DEBUG nova.network.os_vif_util [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.335 186853 DEBUG nova.objects.instance [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.347 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <uuid>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</uuid>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <name>instance-0000005b</name>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:00:04</nova:creationTime>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="serial">36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="uuid">36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:06:f1:87"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <target dev="tap44ab3743-f6"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log" append="off"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:00:04 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:00:04 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:00:04 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:00:04 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.348 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Preparing to wait for external event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.348 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.349 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.349 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.350 186853 DEBUG nova.virt.libvirt.vif [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.350 186853 DEBUG nova.network.os_vif_util [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.351 186853 DEBUG nova.network.os_vif_util [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.351 186853 DEBUG os_vif [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.352 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.352 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.352 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.355 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.356 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44ab3743-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.356 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44ab3743-f6, col_values=(('external_ids', {'iface-id': '44ab3743-f6b1-4f3e-9686-53c9ebd45d37', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:f1:87', 'vm-uuid': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.358 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531887 NetworkManager[55210]: <info>  [1763798404.3597] manager: (tap44ab3743-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.364 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.366 186853 INFO os_vif [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6')#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.454 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.454 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.455 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:06:f1:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:00:04 np0005531887 nova_compute[186849]: 2025-11-22 08:00:04.455 186853 INFO nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Using config drive#033[00m
Nov 22 03:00:04 np0005531887 podman[226452]: 2025-11-22 08:00:04.851878114 +0000 UTC m=+0.061799147 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.299 186853 INFO nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Creating config drive at /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.307 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzr6ley8p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.435 186853 DEBUG oslo_concurrency.processutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzr6ley8p" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:05 np0005531887 kernel: tap44ab3743-f6: entered promiscuous mode
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.4977] manager: (tap44ab3743-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Nov 22 03:00:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:05Z|00255|binding|INFO|Claiming lport 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 for this chassis.
Nov 22 03:00:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:05Z|00256|binding|INFO|44ab3743-f6b1-4f3e-9686-53c9ebd45d37: Claiming fa:16:3e:06:f1:87 10.100.0.9
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.521 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f1:87 10.100.0.9'], port_security=['fa:16:3e:06:f1:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd3a9b01b-3ebf-4060-a682-c07cc7a09738', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=44ab3743-f6b1-4f3e-9686-53c9ebd45d37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.523 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.524 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:00:05 np0005531887 systemd-udevd[226492]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.537 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e54a2103-474f-4401-8782-f7739a06cc3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.538 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a4a282c-d1 in ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.540 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a4a282c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.540 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a99d993c-b176-46b2-a5d9-c86411182b38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.541 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e42000-5fb9-4b0b-8ba5-02c0da040e01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.5463] device (tap44ab3743-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.5476] device (tap44ab3743-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:00:05 np0005531887 systemd-machined[153180]: New machine qemu-37-instance-0000005b.
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.555 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[69348781-7373-40f7-8dde-9f922281b4ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:05Z|00257|binding|INFO|Setting lport 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 ovn-installed in OVS
Nov 22 03:00:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:05Z|00258|binding|INFO|Setting lport 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 up in Southbound
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.567 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 systemd[1]: Started Virtual Machine qemu-37-instance-0000005b.
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.584 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[08d4f539-f774-4aae-ba1c-282547fa18d5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.619 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d53e954b-7c82-45ad-a6b5-05747eb0e6af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.6280] manager: (tap6a4a282c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Nov 22 03:00:05 np0005531887 systemd-udevd[226499]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.626 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[618995fb-e1aa-48de-be5b-9ac963051ba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.660 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a1d2d3-e383-41af-9370-1c1125e727e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.666 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cad5c94b-8cc4-46c4-a3d8-07704329e4d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.6915] device (tap6a4a282c-d0): carrier: link connected
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.699 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ef067f-ec06-4886-949d-fe5040f319aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.716 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef763ca-679f-4a2d-8f79-651cdaf26780]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514829, 'reachable_time': 41281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226528, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.737 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c926bad8-eead-4ae8-aa0d-408a77cd33bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:7a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514829, 'tstamp': 514829}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226529, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.756 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[78809ade-2e5e-48f2-9800-b2cc177f9474]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514829, 'reachable_time': 41281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226530, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.774 186853 DEBUG nova.network.neutron [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updated VIF entry in instance network info cache for port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.776 186853 DEBUG nova.network.neutron [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.790 186853 DEBUG oslo_concurrency.lockutils [req-9414fe00-a3b6-4b73-8f3d-13117106ef35 req-a97be97c-e478-4c83-9351-d0c8bc873268 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.796 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[50cd3107-e4aa-4463-8d4c-50d8e8ef3ffa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.860 186853 DEBUG nova.compute.manager [req-cf31d60a-3772-46c1-af1e-c00eed0769cf req-24e0e009-44f2-4f83-9086-556154fd088a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.861 186853 DEBUG oslo_concurrency.lockutils [req-cf31d60a-3772-46c1-af1e-c00eed0769cf req-24e0e009-44f2-4f83-9086-556154fd088a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.865 186853 DEBUG oslo_concurrency.lockutils [req-cf31d60a-3772-46c1-af1e-c00eed0769cf req-24e0e009-44f2-4f83-9086-556154fd088a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.865 186853 DEBUG oslo_concurrency.lockutils [req-cf31d60a-3772-46c1-af1e-c00eed0769cf req-24e0e009-44f2-4f83-9086-556154fd088a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.865 186853 DEBUG nova.compute.manager [req-cf31d60a-3772-46c1-af1e-c00eed0769cf req-24e0e009-44f2-4f83-9086-556154fd088a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Processing event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.873 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4e13b7a8-b4b0-4d7c-b691-f8ad385ec08f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.875 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.875 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.876 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.877 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 NetworkManager[55210]: <info>  [1763798405.8788] manager: (tap6a4a282c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Nov 22 03:00:05 np0005531887 kernel: tap6a4a282c-d0: entered promiscuous mode
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.880 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:05Z|00259|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.883 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.896 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.897 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.898 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[13d3959b-b3b8-4c87-af16-032800039719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.899 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:00:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:05.900 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'env', 'PROCESS_TAG=haproxy-6a4a282c-db22-41de-b34b-2960aa032ca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a4a282c-db22-41de-b34b-2960aa032ca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.960 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798405.9600804, 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.961 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] VM Started (Lifecycle Event)#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.963 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.967 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.970 186853 INFO nova.virt.libvirt.driver [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Instance spawned successfully.#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.971 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:00:05 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.983 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:05 np0005531887 nova_compute[186849]: 2025-11-22 08:00:05.996 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.000 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.001 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.002 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.002 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.003 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.003 186853 DEBUG nova.virt.libvirt.driver [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.032 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.033 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798405.9603097, 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.033 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.061 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.066 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798405.9668634, 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.067 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.090 186853 INFO nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Took 7.69 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.091 186853 DEBUG nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.094 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.102 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.133 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.168 186853 INFO nova.compute.manager [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Took 8.22 seconds to build instance.#033[00m
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.184 186853 DEBUG oslo_concurrency.lockutils [None req-09673b0c-34fa-4b87-9630-e4a98fb4970a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:06 np0005531887 podman[226570]: 2025-11-22 08:00:06.315705147 +0000 UTC m=+0.033875750 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:00:06 np0005531887 podman[226570]: 2025-11-22 08:00:06.42805693 +0000 UTC m=+0.146227483 container create a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:00:06 np0005531887 systemd[1]: Started libpod-conmon-a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24.scope.
Nov 22 03:00:06 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:00:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/302df88bc62da9afa78331b929022c6d043bd2a26cd3955681c5cc6d5fcd4c01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:00:06 np0005531887 nova_compute[186849]: 2025-11-22 08:00:06.630 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:06 np0005531887 podman[226570]: 2025-11-22 08:00:06.802542779 +0000 UTC m=+0.520713332 container init a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:00:06 np0005531887 podman[226570]: 2025-11-22 08:00:06.809037206 +0000 UTC m=+0.527207739 container start a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:00:06 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [NOTICE]   (226590) : New worker (226592) forked
Nov 22 03:00:06 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [NOTICE]   (226590) : Loading success.
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.963 186853 DEBUG nova.compute.manager [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.964 186853 DEBUG oslo_concurrency.lockutils [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.965 186853 DEBUG oslo_concurrency.lockutils [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.965 186853 DEBUG oslo_concurrency.lockutils [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.966 186853 DEBUG nova.compute.manager [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:07 np0005531887 nova_compute[186849]: 2025-11-22 08:00:07.967 186853 WARNING nova.compute.manager [req-81bc57ab-29d0-4e38-9350-7929f6e1b9ab req-c073f20b-358a-4030-bbee-12ca4814b74b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:08 np0005531887 NetworkManager[55210]: <info>  [1763798408.5430] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Nov 22 03:00:08 np0005531887 NetworkManager[55210]: <info>  [1763798408.5438] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 22 03:00:08 np0005531887 nova_compute[186849]: 2025-11-22 08:00:08.545 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:08 np0005531887 nova_compute[186849]: 2025-11-22 08:00:08.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:08Z|00260|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:00:08 np0005531887 nova_compute[186849]: 2025-11-22 08:00:08.657 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.303 186853 DEBUG nova.compute.manager [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-changed-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.303 186853 DEBUG nova.compute.manager [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing instance network info cache due to event network-changed-44ab3743-f6b1-4f3e-9686-53c9ebd45d37. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.303 186853 DEBUG oslo_concurrency.lockutils [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.303 186853 DEBUG oslo_concurrency.lockutils [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.303 186853 DEBUG nova.network.neutron [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing network info cache for port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:09 np0005531887 nova_compute[186849]: 2025-11-22 08:00:09.358 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:10 np0005531887 podman[226604]: 2025-11-22 08:00:10.514689666 +0000 UTC m=+0.068447097 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 22 03:00:10 np0005531887 nova_compute[186849]: 2025-11-22 08:00:10.813 186853 DEBUG nova.network.neutron [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updated VIF entry in instance network info cache for port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:10 np0005531887 nova_compute[186849]: 2025-11-22 08:00:10.814 186853 DEBUG nova.network.neutron [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:10 np0005531887 nova_compute[186849]: 2025-11-22 08:00:10.837 186853 DEBUG oslo_concurrency.lockutils [req-eb13c6b3-adc6-4ec6-b586-6851d932f460 req-3996dd91-1df8-4dc0-b830-ef06cfbc1140 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:11 np0005531887 nova_compute[186849]: 2025-11-22 08:00:11.633 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531887 nova_compute[186849]: 2025-11-22 08:00:14.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:14 np0005531887 podman[226625]: 2025-11-22 08:00:14.847513239 +0000 UTC m=+0.064899576 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:00:14 np0005531887 podman[226626]: 2025-11-22 08:00:14.889186698 +0000 UTC m=+0.097604806 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 03:00:16 np0005531887 nova_compute[186849]: 2025-11-22 08:00:16.636 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:19 np0005531887 nova_compute[186849]: 2025-11-22 08:00:19.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:19 np0005531887 podman[226672]: 2025-11-22 08:00:19.854539622 +0000 UTC m=+0.067161775 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:00:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:21Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:f1:87 10.100.0.9
Nov 22 03:00:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:21Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:f1:87 10.100.0.9
Nov 22 03:00:21 np0005531887 nova_compute[186849]: 2025-11-22 08:00:21.639 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:24 np0005531887 nova_compute[186849]: 2025-11-22 08:00:24.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:25 np0005531887 podman[226717]: 2025-11-22 08:00:25.851336674 +0000 UTC m=+0.063309505 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:00:26 np0005531887 nova_compute[186849]: 2025-11-22 08:00:26.642 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:29 np0005531887 nova_compute[186849]: 2025-11-22 08:00:29.367 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:31 np0005531887 nova_compute[186849]: 2025-11-22 08:00:31.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:31 np0005531887 podman[226737]: 2025-11-22 08:00:31.85378877 +0000 UTC m=+0.064910826 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:00:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:34.065 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:34 np0005531887 nova_compute[186849]: 2025-11-22 08:00:34.066 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:34.067 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:00:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:34.068 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:34 np0005531887 nova_compute[186849]: 2025-11-22 08:00:34.369 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:35 np0005531887 podman[226758]: 2025-11-22 08:00:35.850190853 +0000 UTC m=+0.063063469 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:00:36 np0005531887 nova_compute[186849]: 2025-11-22 08:00:36.646 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.667 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'hostId': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.687 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.689 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d78c76e-a3d7-46a8-b014-ef129ab8306e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.668870', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '543f3edc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': 'aa9f461a66621ff88b7431f4d6d40038f002cfe172fcc714b80d3d32fa521876'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.668870', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '543f6358-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': '7dfdffd1499cf9967dd846ba183a0a5dde07e10f5fa501f9f1dd2c5757930bce'}]}, 'timestamp': '2025-11-22 08:00:36.689947', '_unique_id': '40d6e06286e344ca93680183230fe8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.726 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.requests volume: 272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.727 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5fc0bf0-0ffc-4330-a35e-9a28c7222799', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 272, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.695270', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544510dc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '4aa8ba915b7660b1bf8070b306947be2a5783ff2d8d91de4f28cdaa4f9f0dec7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.695270', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '54452144-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '53d6fc8b6de2c90214db5de43b5bee20ee9c2e524faca8ffbe2ae9f5ca6a28bc'}]}, 'timestamp': '2025-11-22 08:00:36.727404', '_unique_id': '3491cb1b62514ec686c5de6af039df26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.729 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.latency volume: 7906099803 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.730 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4687d338-0905-4cbc-b52f-7dfaa009193a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7906099803, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.729888', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '54459034-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '5bd832aec6e445741aacc3c0608f50e22cebc28e7741bd853ea3ed522bcd65f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.729888', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '54459ad4-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': 'e680f33ce8f51922dc23a80cb47ccad6bec5923252e3369efdfc83cef64b0b19'}]}, 'timestamp': '2025-11-22 08:00:36.730466', '_unique_id': 'a25ab0506545415f8553f6276349b298'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.735 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 / tap44ab3743-f6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.735 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00a6e04b-d806-4fc2-82fd-5657c8d9be6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.731918', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '54466748-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': 'd3e4ba2aaa0d4ad567ab2723c464795f292a3d1b93cbaaa7f80fb9a348aa5d7a'}]}, 'timestamp': '2025-11-22 08:00:36.735733', '_unique_id': '5c97e2e895934973a49a79c090d54921'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.737 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62ab0e86-c0f8-4b05-93a1-f04f36762575', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.737737', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '5446c1b6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': '48c6d22b7e3637a2660ef4d3c498dd739ef77bbca9180356653843a78be09c01'}]}, 'timestamp': '2025-11-22 08:00:36.738027', '_unique_id': '4408b2e854a74756a342192ae4e5e922'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.756 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/cpu volume: 14870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '454040ac-3c73-4bc8-845e-19ad99a29921', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14870000000, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'timestamp': '2025-11-22T08:00:36.739293', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '544997d8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.425664882, 'message_signature': '81f586e9454b20a20c3514d0f792f66ff4b49be25377d1748d47d141afbb2a68'}]}, 'timestamp': '2025-11-22 08:00:36.756737', '_unique_id': 'ea1033fdde4c42a0a87a012799f68b9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.758 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9889cae-6aad-4bad-97fb-9d5fbe9bf9d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.758746', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '5449f5f2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': 'c3d5769844c94fb8f8ff049d35d857e6c06f53f354fbc9bc5bc7e03367ef61d9'}]}, 'timestamp': '2025-11-22 08:00:36.759007', '_unique_id': 'a97f810d6ec149799a8d081048466459'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.760 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.bytes volume: 31001088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.760 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fdfd1da-0adb-4cc3-a291-006b99bdf0d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31001088, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.760139', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544a2c66-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': 'b3e3c8461e7da870370258c39edfa51adcd3bc4cd3a5419cd36b58e2753b5421'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.760139', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544a347c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': 'e30aff89904353f072eb3f999acf5ce7332b66d7112c2cbae32bb4494ae8fff0'}]}, 'timestamp': '2025-11-22 08:00:36.760610', '_unique_id': '914fc7bca169499e83fff7a1167215ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.761 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/memory.usage volume: 42.65625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f533944c-d5f2-4fc3-9987-32f9c65c8bed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.65625, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'timestamp': '2025-11-22T08:00:36.761729', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '544a69ba-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.425664882, 'message_signature': 'c44594adb8ddc262b73a51a2bfcf915ffe94a20bc53de191e7068d324a0a2f79'}]}, 'timestamp': '2025-11-22 08:00:36.761953', '_unique_id': '1d4acbff9c6e43f3b884e646fe00168d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.763 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.763 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>]
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.763 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.requests volume: 1131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.763 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbfdcf40-94cb-4a50-85da-600687ff605d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1131, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.763361', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544aa95c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '419921b84c60d341209fa9facd89a62a560868e20c8b3d038c5fb56e8ee67dea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 
'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.763361', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544ab186-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '3cabd3efc63d35b7de5258b47d0266f8276e1c242c4df1942ee90e0fce9cb76d'}]}, 'timestamp': '2025-11-22 08:00:36.763783', '_unique_id': '4f18cedc8ba64c1a91cfe1f638a66072'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.764 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f93d9c60-a2ae-454d-a6a0-9bbc898e28a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.764945', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544ae746-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': 'cc81e8fd83c0c37f98709ae36f6ec2a30dbc8f9d099c3646ac20e456462cc95b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 
'36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.764945', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544aefb6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': '6c3f3e7626d0199c53e2c9cc6c77be9f0b2dcba532f608e3b30dd0bcffbde609'}]}, 'timestamp': '2025-11-22 08:00:36.765375', '_unique_id': '3bd9c7bac4144545a49b43073d905ee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.766 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.incoming.bytes volume: 1722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f7f57f8-6b50-446d-9124-abc13cf7e2b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1722, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.766485', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544b2396-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': '2c8fb91637bbd4f7bb8237554d383a84b373c6627d2a71b45f9ca5624a6b53fb'}]}, 'timestamp': '2025-11-22 08:00:36.766717', '_unique_id': '9cfce9b95b144fe9b61ba30dc517653e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.767 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.latency volume: 2307051034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.read.latency volume: 80473677 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88151b60-d4b8-4e34-b9e1-846c066aac4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2307051034, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.767939', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544b5c26-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '8db834426e11b9475a2b99817e35be8e6b755bd0ec8b5a9485c05cd248def459'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80473677, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.767939', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544b6482-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '0131a53c68ef6332d98cf77e1f052cf131dc1bc26a01f8407de713a73fa9d843'}]}, 'timestamp': '2025-11-22 08:00:36.768366', '_unique_id': 'd8e687b389b441039002ca19bf389c74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.769 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '510a9312-c131-4dcd-8012-7352cdea9c98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.769505', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544b995c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': '8862d264da5a705ceb5cdfb630a9a592f792ac65f70225cb323c9afb089b95cd'}]}, 'timestamp': '2025-11-22 08:00:36.769731', '_unique_id': '2688096fe8604134b261ba3e130e239a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.770 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa8b4e78-8fe4-4de4-89ee-fd752063de0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.770863', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544bceae-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': 'ad3827d5074f50a5cf9c0655fcea7a026af62a5f3d5dcde4500b20fd6e6d28e7'}]}, 'timestamp': '2025-11-22 08:00:36.771097', '_unique_id': '98b3b1441a6647e78d94f0b548342808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.771 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.772 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.772 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3b55e9e-3e58-4ed5-b106-1f0676a0084a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.772147', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544c0108-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': 'f1c0267ca6215eefea17d04845731e37d0bcadf9ffadf93eefc59b25e23cdd6a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.772147', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544c08ce-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.338689459, 'message_signature': 'f6c6911169ad2597fd94638171f36b0964ed7f9ca7c99a1c982dc8e255731772'}]}, 'timestamp': '2025-11-22 08:00:36.772604', '_unique_id': '0cfb72d3efdf4b81b390e023b32bc2ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.773 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339c8d34-0b1b-40a2-bc07-d35ddfd57e2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.773851', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544c4320-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': 'bd63ec1193ca89e54c63f361f77f13107b9344b941af8b1bda819dcef2f0dfd5'}]}, 'timestamp': '2025-11-22 08:00:36.774077', '_unique_id': '5e928ee5e4f54361b2717e411bdf92b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.774 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61ceac36-8d92-4783-bed4-b220d700a304', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.775099', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544c73c2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': '8e2003207677828adab25c7bff53c757fca7bf0ae85da50ed607ae166b1dac1d'}]}, 'timestamp': '2025-11-22 08:00:36.775348', '_unique_id': 'f055efb56d91498ea3abcbe00ddce0d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.776 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.776 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>]
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.776 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30dc9bba-dd93-48d5-b3c3-51c9931c7e88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.776701', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544cb274-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': 'c7a8c68e0fc7e786ea9a3ef11cde3a931bf17e1a361994e81b6e6634e86fff6f'}]}, 'timestamp': '2025-11-22 08:00:36.776926', '_unique_id': 'c79a6f48a7004105b2583f448f5f3be2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.777 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.778 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>]
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.778 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.bytes volume: 72933376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.778 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '652a8ee1-5a4d-4fc3-ad34-b9742ea83262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72933376, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-vda', 'timestamp': '2025-11-22T08:00:36.778223', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '544cee88-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '36a9dbfda5f3477652b4defc0f90032e354555d334f68658048fd7e1e08f370c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7-sda', 'timestamp': '2025-11-22T08:00:36.778223', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'instance-0000005b', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '544cf676-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.365151039, 'message_signature': '2151410890e4f8025cb6caba4826a8e9fc7da4042844a93b8065a850e5bc3cdf'}]}, 'timestamp': '2025-11-22 08:00:36.778656', '_unique_id': 'f1506f56013c41569176698a6c9e9153'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-733706123>]
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 DEBUG ceilometer.compute.pollsters [-] 36558ab5-ef38-44f5-8dd6-98c8e20c68c7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd114db76-97a3-484e-acf4-08d5f3eec49e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005b-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-tap44ab3743-f6', 'timestamp': '2025-11-22T08:00:36.779986', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-733706123', 'name': 'tap44ab3743-f6', 'instance_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:06:f1:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap44ab3743-f6'}, 'message_id': '544d32b2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5179.401766328, 'message_signature': '6f2b3bda67346a6317bbb621ccf9540c8cde2ad3981a96d71df8c0b696a95a84'}]}, 'timestamp': '2025-11-22 08:00:36.780209', '_unique_id': 'd3352c42d20a4630a8250727bf96788c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:00:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:37.332 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:37.333 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:37.334 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:39 np0005531887 nova_compute[186849]: 2025-11-22 08:00:39.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:40 np0005531887 podman[226783]: 2025-11-22 08:00:40.883302385 +0000 UTC m=+0.095263625 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:00:41 np0005531887 nova_compute[186849]: 2025-11-22 08:00:41.650 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:42 np0005531887 nova_compute[186849]: 2025-11-22 08:00:42.624 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:42 np0005531887 nova_compute[186849]: 2025-11-22 08:00:42.625 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:42 np0005531887 nova_compute[186849]: 2025-11-22 08:00:42.625 186853 DEBUG nova.objects.instance [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:43 np0005531887 nova_compute[186849]: 2025-11-22 08:00:43.734 186853 DEBUG nova.objects.instance [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:43 np0005531887 nova_compute[186849]: 2025-11-22 08:00:43.756 186853 DEBUG nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:00:43 np0005531887 nova_compute[186849]: 2025-11-22 08:00:43.941 186853 DEBUG nova.policy [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:00:44 np0005531887 nova_compute[186849]: 2025-11-22 08:00:44.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:45 np0005531887 nova_compute[186849]: 2025-11-22 08:00:45.345 186853 DEBUG nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Successfully created port: 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:00:45 np0005531887 podman[226804]: 2025-11-22 08:00:45.85096472 +0000 UTC m=+0.064820514 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:00:45 np0005531887 podman[226805]: 2025-11-22 08:00:45.897770551 +0000 UTC m=+0.105774375 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:00:46 np0005531887 nova_compute[186849]: 2025-11-22 08:00:46.652 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:46 np0005531887 nova_compute[186849]: 2025-11-22 08:00:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.460 186853 DEBUG nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Successfully updated port: 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.478 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.479 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.479 186853 DEBUG nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.586 186853 DEBUG nova.compute.manager [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-changed-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.587 186853 DEBUG nova.compute.manager [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing instance network info cache due to event network-changed-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.587 186853 DEBUG oslo_concurrency.lockutils [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:48 np0005531887 nova_compute[186849]: 2025-11-22 08:00:48.691 186853 WARNING nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:00:49 np0005531887 nova_compute[186849]: 2025-11-22 08:00:49.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:50 np0005531887 podman[226853]: 2025-11-22 08:00:50.843585245 +0000 UTC m=+0.063266974 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.457 186853 DEBUG nova.network.neutron [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.496 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.498 186853 DEBUG oslo_concurrency.lockutils [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.498 186853 DEBUG nova.network.neutron [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Refreshing network info cache for port 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.504 186853 DEBUG nova.virt.libvirt.vif [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.505 186853 DEBUG nova.network.os_vif_util [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.506 186853 DEBUG nova.network.os_vif_util [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.506 186853 DEBUG os_vif [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.507 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.508 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.512 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55cf8749-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.512 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap55cf8749-2c, col_values=(('external_ids', {'iface-id': '55cf8749-2c0c-4c1d-82dc-ccef7f624cb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:cc:9e', 'vm-uuid': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 NetworkManager[55210]: <info>  [1763798451.5168] manager: (tap55cf8749-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.518 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.526 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.527 186853 INFO os_vif [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c')#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.528 186853 DEBUG nova.virt.libvirt.vif [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.529 186853 DEBUG nova.network.os_vif_util [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.530 186853 DEBUG nova.network.os_vif_util [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.533 186853 DEBUG nova.virt.libvirt.guest [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:cf:cc:9e"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <target dev="tap55cf8749-2c"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:00:51 np0005531887 nova_compute[186849]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:00:51 np0005531887 NetworkManager[55210]: <info>  [1763798451.5515] manager: (tap55cf8749-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Nov 22 03:00:51 np0005531887 kernel: tap55cf8749-2c: entered promiscuous mode
Nov 22 03:00:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:51Z|00261|binding|INFO|Claiming lport 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 for this chassis.
Nov 22 03:00:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:51Z|00262|binding|INFO|55cf8749-2c0c-4c1d-82dc-ccef7f624cb3: Claiming fa:16:3e:cf:cc:9e 10.100.0.14
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.556 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.567 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:cc:9e 10.100.0.14'], port_security=['fa:16:3e:cf:cc:9e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:51Z|00263|binding|INFO|Setting lport 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 ovn-installed in OVS
Nov 22 03:00:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:51Z|00264|binding|INFO|Setting lport 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 up in Southbound
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.569 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.571 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.571 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.573 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.588 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[eba8f08d-7f18-4d31-ba0b-17c0ff47431b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 systemd-udevd[226885]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:00:51 np0005531887 NetworkManager[55210]: <info>  [1763798451.6088] device (tap55cf8749-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:00:51 np0005531887 NetworkManager[55210]: <info>  [1763798451.6098] device (tap55cf8749-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.629 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2d52ad54-106e-45fe-a21d-32e76f6940b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.634 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[441e63d6-4263-44ed-92aa-06767132a6fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.655 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.669 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[522865dc-df54-4fc6-a64e-38f58e5fd607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.689 186853 DEBUG nova.virt.libvirt.driver [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.691 186853 DEBUG nova.virt.libvirt.driver [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.691 186853 DEBUG nova.virt.libvirt.driver [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:06:f1:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.691 186853 DEBUG nova.virt.libvirt.driver [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:cf:cc:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.695 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ca61ed1d-5948-46a8-8e8b-99238c38ffa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514829, 'reachable_time': 41281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226892, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.714 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a62328-ea47-4da1-8a71-6291718c12e9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514843, 'tstamp': 514843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226893, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514847, 'tstamp': 514847}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226893, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.717 186853 DEBUG nova.virt.libvirt.guest [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:51</nova:creationTime>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:51 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    <nova:port uuid="55cf8749-2c0c-4c1d-82dc-ccef7f624cb3">
Nov 22 03:00:51 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:51 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:51 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:51 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.721 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.727 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.734 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.735 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.735 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:51.735 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.765 186853 DEBUG oslo_concurrency.lockutils [None req-909fe67c-3ffe-4699-aa7b-1eb71a48a270 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:51 np0005531887 nova_compute[186849]: 2025-11-22 08:00:51.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.174 186853 DEBUG nova.compute.manager [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.174 186853 DEBUG oslo_concurrency.lockutils [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.175 186853 DEBUG oslo_concurrency.lockutils [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.175 186853 DEBUG oslo_concurrency.lockutils [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.175 186853 DEBUG nova.compute.manager [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.175 186853 WARNING nova.compute.manager [req-48f218b8-19c6-4bfc-99ab-50536281284c req-3ba5da90-ba45-4915-ba97-7b457fc7c893 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.817 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.817 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.818 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.818 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.893 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.976 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:52 np0005531887 nova_compute[186849]: 2025-11-22 08:00:52.978 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.047 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.334 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.336 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5591MB free_disk=73.31648254394531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.336 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.336 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.460 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.461 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.461 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.476 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.494 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.495 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.519 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.559 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.616 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.644 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.692 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.694 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.954 186853 DEBUG nova.network.neutron [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updated VIF entry in instance network info cache for port 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.955 186853 DEBUG nova.network.neutron [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:53 np0005531887 nova_compute[186849]: 2025-11-22 08:00:53.979 186853 DEBUG oslo_concurrency.lockutils [req-4a36be55-205e-41c8-9bf0-3abe43de8920 req-cf818d83-47a0-4cf3-9078-c1cca1965f46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.109 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.110 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.133 186853 DEBUG nova.objects.instance [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.177 186853 DEBUG nova.virt.libvirt.vif [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.178 186853 DEBUG nova.network.os_vif_util [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.179 186853 DEBUG nova.network.os_vif_util [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.184 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.187 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.191 186853 DEBUG nova.virt.libvirt.driver [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Attempting to detach device tap55cf8749-2c from instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.192 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:cf:cc:9e"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <target dev="tap55cf8749-2c"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.202 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.209 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface>not found in domain: <domain type='kvm' id='37'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <name>instance-0000005b</name>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <uuid>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</uuid>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:51</nova:creationTime>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:port uuid="55cf8749-2c0c-4c1d-82dc-ccef7f624cb3">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='serial'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='uuid'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk' index='2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config' index='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:06:f1:87'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='tap44ab3743-f6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:cf:cc:9e'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='tap55cf8749-2c'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='net1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c632,c844</label>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c632,c844</imagelabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.210 186853 INFO nova.virt.libvirt.driver [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap55cf8749-2c from instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 from the persistent domain config.#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.211 186853 DEBUG nova.virt.libvirt.driver [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] (1/8): Attempting to detach device tap55cf8749-2c with device alias net1 from instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.212 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:cf:cc:9e"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <target dev="tap55cf8749-2c"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.275 186853 DEBUG nova.compute.manager [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:54 np0005531887 kernel: tap55cf8749-2c (unregistering): left promiscuous mode
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.275 186853 DEBUG oslo_concurrency.lockutils [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.276 186853 DEBUG oslo_concurrency.lockutils [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.276 186853 DEBUG oslo_concurrency.lockutils [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.276 186853 DEBUG nova.compute.manager [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.277 186853 WARNING nova.compute.manager [req-76af0ab4-b540-4c97-a3e2-14b8058cf24f req-db4fe956-7bf3-4abe-b4cc-fc6bf7544766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:54 np0005531887 NetworkManager[55210]: <info>  [1763798454.2801] device (tap55cf8749-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:00:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:54Z|00265|binding|INFO|Releasing lport 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 from this chassis (sb_readonly=0)
Nov 22 03:00:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:54Z|00266|binding|INFO|Setting lport 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 down in Southbound
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.289 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:54Z|00267|binding|INFO|Removing iface tap55cf8749-2c ovn-installed in OVS
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.294 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Received event <DeviceRemovedEvent: 1763798454.2938132, 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.296 186853 DEBUG nova.virt.libvirt.driver [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Start waiting for the detach event from libvirt for device tap55cf8749-2c with device alias net1 for instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.297 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.297 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:cc:9e 10.100.0.14'], port_security=['fa:16:3e:cf:cc:9e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.299 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.301 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.301 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface>not found in domain: <domain type='kvm' id='37'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <name>instance-0000005b</name>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <uuid>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</uuid>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:51</nova:creationTime>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:port uuid="55cf8749-2c0c-4c1d-82dc-ccef7f624cb3">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='serial'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='uuid'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk' index='2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config' index='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:06:f1:87'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target dev='tap44ab3743-f6'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c632,c844</label>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c632,c844</imagelabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.302 186853 INFO nova.virt.libvirt.driver [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap55cf8749-2c from instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 from the live domain config.#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.302 186853 DEBUG nova.virt.libvirt.vif [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.303 186853 DEBUG nova.network.os_vif_util [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.304 186853 DEBUG nova.network.os_vif_util [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.304 186853 DEBUG os_vif [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.307 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55cf8749-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.309 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.310 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.311 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.314 186853 INFO os_vif [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c')#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.315 186853 DEBUG nova.virt.libvirt.guest [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:54</nova:creationTime>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:54 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:54 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:54 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.321 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[638caf49-1916-4fe2-b6d9-f0c21bc86254]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.364 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5421df-7a45-436e-8770-e023fd822e54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.369 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ecac184d-d568-4aa5-b684-e19d46d67342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.408 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[11ece7fd-693b-424a-b0f8-772c8122d6b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.431 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c209d71e-cabb-4137-b2a0-8a3aa59bf1d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514829, 'reachable_time': 41281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226910, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.451 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[871eda00-ac74-4902-99cd-37f5b37d565d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514843, 'tstamp': 514843}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226911, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514847, 'tstamp': 514847}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226911, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.453 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.457 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.458 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.458 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:54.459 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.696 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.696 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.937 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.937 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:00:54 np0005531887 nova_compute[186849]: 2025-11-22 08:00:54.937 186853 DEBUG nova.network.neutron [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.009 186853 DEBUG nova.compute.manager [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-deleted-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.009 186853 INFO nova.compute.manager [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Neutron deleted interface 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.009 186853 DEBUG nova.network.neutron [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.029 186853 DEBUG nova.objects.instance [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'system_metadata' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.067 186853 DEBUG nova.objects.instance [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'flavor' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.110 186853 DEBUG nova.virt.libvirt.vif [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.111 186853 DEBUG nova.network.os_vif_util [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.112 186853 DEBUG nova.network.os_vif_util [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.120 186853 DEBUG nova.virt.libvirt.guest [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.126 186853 DEBUG nova.virt.libvirt.guest [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface>not found in domain: <domain type='kvm' id='37'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <name>instance-0000005b</name>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <uuid>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</uuid>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:54</nova:creationTime>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='serial'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='uuid'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk' index='2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config' index='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:06:f1:87'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='tap44ab3743-f6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c632,c844</label>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c632,c844</imagelabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.128 186853 DEBUG nova.virt.libvirt.guest [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.134 186853 DEBUG nova.virt.libvirt.guest [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:cf:cc:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap55cf8749-2c"/></interface>not found in domain: <domain type='kvm' id='37'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <name>instance-0000005b</name>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <uuid>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</uuid>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:54</nova:creationTime>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='serial'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='uuid'>36558ab5-ef38-44f5-8dd6-98c8e20c68c7</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk' index='2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/disk.config' index='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:06:f1:87'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target dev='tap44ab3743-f6'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7/console.log' append='off'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c632,c844</label>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c632,c844</imagelabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.135 186853 WARNING nova.virt.libvirt.driver [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Detaching interface fa:16:3e:cf:cc:9e failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap55cf8749-2c' not found.#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.136 186853 DEBUG nova.virt.libvirt.vif [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.136 186853 DEBUG nova.network.os_vif_util [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "address": "fa:16:3e:cf:cc:9e", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55cf8749-2c", "ovs_interfaceid": "55cf8749-2c0c-4c1d-82dc-ccef7f624cb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.138 186853 DEBUG nova.network.os_vif_util [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.139 186853 DEBUG os_vif [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.142 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55cf8749-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.142 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.145 186853 INFO os_vif [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:cc:9e,bridge_name='br-int',has_traffic_filtering=True,id=55cf8749-2c0c-4c1d-82dc-ccef7f624cb3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55cf8749-2c')#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.147 186853 DEBUG nova.virt.libvirt.guest [req-781c89ef-4978-462f-9e4b-49b0675086dc req-d80f55b9-563b-49d6-aeec-63f2e82b920b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-733706123</nova:name>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:00:55</nova:creationTime>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    <nova:port uuid="44ab3743-f6b1-4f3e-9686-53c9ebd45d37">
Nov 22 03:00:55 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:00:55 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:00:55 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:55 np0005531887 nova_compute[186849]: 2025-11-22 08:00:55.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.390 186853 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-unplugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.391 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.391 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.391 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.391 186853 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-unplugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.391 186853 WARNING nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-unplugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.392 186853 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.392 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.392 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.392 186853 DEBUG oslo_concurrency.lockutils [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.393 186853 DEBUG nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.393 186853 WARNING nova.compute.manager [req-fc0c5971-f7c9-4f72-90c7-f3053740ba0c req-1d252926-4d50-441d-889d-adb4e18c5310 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-plugged-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.658 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.662 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.662 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.663 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.663 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.663 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.671 186853 INFO nova.compute.manager [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Terminating instance#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.680 186853 DEBUG nova.compute.manager [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:00:56 np0005531887 kernel: tap44ab3743-f6 (unregistering): left promiscuous mode
Nov 22 03:00:56 np0005531887 NetworkManager[55210]: <info>  [1763798456.7137] device (tap44ab3743-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:00:56 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:56Z|00268|binding|INFO|Releasing lport 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 from this chassis (sb_readonly=0)
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.724 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:56Z|00269|binding|INFO|Setting lport 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 down in Southbound
Nov 22 03:00:56 np0005531887 ovn_controller[95130]: 2025-11-22T08:00:56Z|00270|binding|INFO|Removing iface tap44ab3743-f6 ovn-installed in OVS
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.729 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:56.734 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f1:87 10.100.0.9'], port_security=['fa:16:3e:06:f1:87 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '36558ab5-ef38-44f5-8dd6-98c8e20c68c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd3a9b01b-3ebf-4060-a682-c07cc7a09738', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=44ab3743-f6b1-4f3e-9686-53c9ebd45d37) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:00:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:56.736 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 44ab3743-f6b1-4f3e-9686-53c9ebd45d37 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:00:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:56.737 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a4a282c-db22-41de-b34b-2960aa032ca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:00:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:56.738 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6097760c-53e2-48ee-b9f3-baf7eac1acb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:56.740 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace which is not needed anymore#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:56 np0005531887 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Nov 22 03:00:56 np0005531887 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005b.scope: Consumed 17.995s CPU time.
Nov 22 03:00:56 np0005531887 systemd-machined[153180]: Machine qemu-37-instance-0000005b terminated.
Nov 22 03:00:56 np0005531887 podman[226916]: 2025-11-22 08:00:56.847463049 +0000 UTC m=+0.080401594 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:00:56 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [NOTICE]   (226590) : haproxy version is 2.8.14-c23fe91
Nov 22 03:00:56 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [NOTICE]   (226590) : path to executable is /usr/sbin/haproxy
Nov 22 03:00:56 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [WARNING]  (226590) : Exiting Master process...
Nov 22 03:00:56 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [ALERT]    (226590) : Current worker (226592) exited with code 143 (Terminated)
Nov 22 03:00:56 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[226586]: [WARNING]  (226590) : All workers exited. Exiting... (0)
Nov 22 03:00:56 np0005531887 systemd[1]: libpod-a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24.scope: Deactivated successfully.
Nov 22 03:00:56 np0005531887 podman[226954]: 2025-11-22 08:00:56.957926303 +0000 UTC m=+0.099011272 container died a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.986 186853 INFO nova.virt.libvirt.driver [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Instance destroyed successfully.#033[00m
Nov 22 03:00:56 np0005531887 nova_compute[186849]: 2025-11-22 08:00:56.987 186853 DEBUG nova.objects.instance [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'resources' on Instance uuid 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:00:57 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24-userdata-shm.mount: Deactivated successfully.
Nov 22 03:00:57 np0005531887 systemd[1]: var-lib-containers-storage-overlay-302df88bc62da9afa78331b929022c6d043bd2a26cd3955681c5cc6d5fcd4c01-merged.mount: Deactivated successfully.
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.033 186853 DEBUG nova.virt.libvirt.vif [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-733706123',display_name='tempest-AttachInterfacesTestJSON-server-733706123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-733706123',id=91,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEVCPVBzzzKnP+R6/lDJiQ5B8RfgEpMCDa4dk9to8phNzvju3oinz4x7dgw6Zbn8afUUsbXFYWd5w1dgd7O2KN/oXSpHA9eKzGAIMbR7cZPcwDLZProoH/PeBT61VIfVUA==',key_name='tempest-keypair-329982621',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-lkepgert',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=36558ab5-ef38-44f5-8dd6-98c8e20c68c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.034 186853 DEBUG nova.network.os_vif_util [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.035 186853 DEBUG nova.network.os_vif_util [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.036 186853 DEBUG os_vif [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.038 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.038 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44ab3743-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.047 186853 INFO os_vif [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:f1:87,bridge_name='br-int',has_traffic_filtering=True,id=44ab3743-f6b1-4f3e-9686-53c9ebd45d37,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44ab3743-f6')#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.049 186853 INFO nova.virt.libvirt.driver [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Deleting instance files /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7_del#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.049 186853 INFO nova.virt.libvirt.driver [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Deletion of /var/lib/nova/instances/36558ab5-ef38-44f5-8dd6-98c8e20c68c7_del complete#033[00m
Nov 22 03:00:57 np0005531887 podman[226954]: 2025-11-22 08:00:57.063670087 +0000 UTC m=+0.204755056 container cleanup a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:00:57 np0005531887 systemd[1]: libpod-conmon-a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24.scope: Deactivated successfully.
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.173 186853 INFO nova.network.neutron [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Port 55cf8749-2c0c-4c1d-82dc-ccef7f624cb3 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.174 186853 DEBUG nova.network.neutron [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [{"id": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "address": "fa:16:3e:06:f1:87", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44ab3743-f6", "ovs_interfaceid": "44ab3743-f6b1-4f3e-9686-53c9ebd45d37", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.196 186853 INFO nova.compute.manager [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.197 186853 DEBUG oslo.service.loopingcall [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.197 186853 DEBUG nova.compute.manager [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.198 186853 DEBUG nova.network.neutron [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.201 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-36558ab5-ef38-44f5-8dd6-98c8e20c68c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:00:57 np0005531887 podman[226998]: 2025-11-22 08:00:57.220157382 +0000 UTC m=+0.130268334 container remove a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.226 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[67aad663-54f5-4447-98ff-c9735736d00c]: (4, ('Sat Nov 22 08:00:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24)\na84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24\nSat Nov 22 08:00:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (a84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24)\na84878a3ec868ea3b301ca39714621f96c654ba6914f1c781c09a1080ab34c24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.228 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b6242bee-1bf5-4558-b04c-e7f69608f982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.229 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.231 186853 DEBUG oslo_concurrency.lockutils [None req-b232cbdd-955f-45ca-83b5-ba9e2ed07e8f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-36558ab5-ef38-44f5-8dd6-98c8e20c68c7-55cf8749-2c0c-4c1d-82dc-ccef7f624cb3" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:57 np0005531887 kernel: tap6a4a282c-d0: left promiscuous mode
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.233 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:57 np0005531887 nova_compute[186849]: 2025-11-22 08:00:57.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.249 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5f890c-d5b0-401d-b9ca-d7d2e0fffce7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.263 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1e60df-c459-4592-a2d8-c207293abe81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.265 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9072b823-231f-44b2-b4df-0eb78f7e470e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.286 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b17562-08ea-4e09-a38c-008f2c293e97]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514822, 'reachable_time': 34477, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227013, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:57 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6a4a282c\x2ddb22\x2d41de\x2db34b\x2d2960aa032ca8.mount: Deactivated successfully.
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.294 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:00:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:00:57.296 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[33313d0f-d52a-405c-92e9-27c12ccb93c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.532 186853 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-unplugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.533 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.533 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.533 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.534 186853 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-unplugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.534 186853 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-unplugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.534 186853 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.534 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.535 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.535 186853 DEBUG oslo_concurrency.lockutils [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.535 186853 DEBUG nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] No waiting events found dispatching network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.535 186853 WARNING nova.compute.manager [req-dd6c16cf-b5a5-4faf-919d-ac6b6216cdfc req-4c558849-9790-49b9-b9b3-777cc849daa8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received unexpected event network-vif-plugged-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 22 03:00:58 np0005531887 nova_compute[186849]: 2025-11-22 08:00:58.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.068 186853 DEBUG nova.network.neutron [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.085 186853 INFO nova.compute.manager [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Took 2.89 seconds to deallocate network for instance.#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.179 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.180 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.233 186853 DEBUG nova.compute.manager [req-de6754c2-41dd-47c5-9a8c-d21e91f6d747 req-6000480c-1b19-49b5-9fb7-f8b080a96d24 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Received event network-vif-deleted-44ab3743-f6b1-4f3e-9686-53c9ebd45d37 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.272 186853 DEBUG nova.compute.provider_tree [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.296 186853 DEBUG nova.scheduler.client.report [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.322 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.353 186853 INFO nova.scheduler.client.report [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Deleted allocations for instance 36558ab5-ef38-44f5-8dd6-98c8e20c68c7#033[00m
Nov 22 03:01:00 np0005531887 nova_compute[186849]: 2025-11-22 08:01:00.702 186853 DEBUG oslo_concurrency.lockutils [None req-6e91bf1b-173e-4dae-bb92-bc5ba2a0bb6a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "36558ab5-ef38-44f5-8dd6-98c8e20c68c7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:01 np0005531887 nova_compute[186849]: 2025-11-22 08:01:01.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:02 np0005531887 nova_compute[186849]: 2025-11-22 08:01:02.041 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:02 np0005531887 podman[227025]: 2025-11-22 08:01:02.891782421 +0000 UTC m=+0.101895176 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:06 np0005531887 nova_compute[186849]: 2025-11-22 08:01:06.662 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:06 np0005531887 podman[227045]: 2025-11-22 08:01:06.833566848 +0000 UTC m=+0.056584822 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:01:07 np0005531887 nova_compute[186849]: 2025-11-22 08:01:07.044 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.264 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.265 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.297 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.442 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.443 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.449 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.450 186853 INFO nova.compute.claims [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.605 186853 DEBUG nova.compute.provider_tree [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.631 186853 DEBUG nova.scheduler.client.report [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.666 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.667 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.730 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.731 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.751 186853 INFO nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.773 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:01:11 np0005531887 podman[227069]: 2025-11-22 08:01:11.859943668 +0000 UTC m=+0.079241864 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.896 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.898 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.898 186853 INFO nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Creating image(s)#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.899 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.899 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.900 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.915 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.985 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798456.9844558, 36558ab5-ef38-44f5-8dd6-98c8e20c68c7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.986 186853 INFO nova.compute.manager [-] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.990 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.990 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:11 np0005531887 nova_compute[186849]: 2025-11-22 08:01:11.991 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.002 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.025 186853 DEBUG nova.compute.manager [None req-19dfbfc0-d9c2-4728-a51b-1713dbf95c19 - - - - - -] [instance: 36558ab5-ef38-44f5-8dd6-98c8e20c68c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.045 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.069 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.070 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.235 186853 DEBUG nova.policy [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.591 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk 1073741824" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.592 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.593 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.660 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.662 186853 DEBUG nova.virt.disk.api [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Checking if we can resize image /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.662 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.735 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.736 186853 DEBUG nova.virt.disk.api [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Cannot resize image /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.736 186853 DEBUG nova.objects.instance [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'migration_context' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.755 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.756 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Ensure instance console log exists: /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.756 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.757 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:12 np0005531887 nova_compute[186849]: 2025-11-22 08:01:12.757 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:13 np0005531887 nova_compute[186849]: 2025-11-22 08:01:13.833 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully created port: 8979b5b9-122d-4f08-832e-80c8509b9f9a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:01:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:15.387 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.388 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:15.389 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.685 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully updated port: 8979b5b9-122d-4f08-832e-80c8509b9f9a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.703 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.704 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.704 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.902 186853 DEBUG nova.compute.manager [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-changed-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.902 186853 DEBUG nova.compute.manager [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing instance network info cache due to event network-changed-8979b5b9-122d-4f08-832e-80c8509b9f9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:01:15 np0005531887 nova_compute[186849]: 2025-11-22 08:01:15.902 186853 DEBUG oslo_concurrency.lockutils [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:16 np0005531887 nova_compute[186849]: 2025-11-22 08:01:16.045 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:01:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:16.391 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:16 np0005531887 nova_compute[186849]: 2025-11-22 08:01:16.665 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:16 np0005531887 podman[227105]: 2025-11-22 08:01:16.850341256 +0000 UTC m=+0.069376242 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:01:16 np0005531887 podman[227106]: 2025-11-22 08:01:16.880604342 +0000 UTC m=+0.096906907 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.047 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.291 186853 DEBUG nova.network.neutron [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.313 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.313 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Instance network_info: |[{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.313 186853 DEBUG oslo_concurrency.lockutils [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.313 186853 DEBUG nova.network.neutron [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing network info cache for port 8979b5b9-122d-4f08-832e-80c8509b9f9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.317 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Start _get_guest_xml network_info=[{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.324 186853 WARNING nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.331 186853 DEBUG nova.virt.libvirt.host [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.332 186853 DEBUG nova.virt.libvirt.host [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.335 186853 DEBUG nova.virt.libvirt.host [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.335 186853 DEBUG nova.virt.libvirt.host [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.336 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.336 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.337 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.337 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.337 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.337 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.338 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.338 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.338 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.338 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.338 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.339 186853 DEBUG nova.virt.hardware [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.342 186853 DEBUG nova.virt.libvirt.vif [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.342 186853 DEBUG nova.network.os_vif_util [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.343 186853 DEBUG nova.network.os_vif_util [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.344 186853 DEBUG nova.objects.instance [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_devices' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.360 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <uuid>fb921e88-22d2-4ae6-9d09-970505f0d5bb</uuid>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <name>instance-0000005f</name>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:01:17</nova:creationTime>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="serial">fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="uuid">fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:a0:69:e9"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <target dev="tap8979b5b9-12"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log" append="off"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:01:17 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:01:17 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:01:17 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:01:17 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.362 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Preparing to wait for external event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.362 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.362 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.362 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.363 186853 DEBUG nova.virt.libvirt.vif [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.364 186853 DEBUG nova.network.os_vif_util [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.364 186853 DEBUG nova.network.os_vif_util [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.365 186853 DEBUG os_vif [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.366 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.366 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.366 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.369 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.370 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8979b5b9-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.370 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8979b5b9-12, col_values=(('external_ids', {'iface-id': '8979b5b9-122d-4f08-832e-80c8509b9f9a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:69:e9', 'vm-uuid': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:17 np0005531887 NetworkManager[55210]: <info>  [1763798477.3738] manager: (tap8979b5b9-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.376 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.383 186853 INFO os_vif [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12')#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.610 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.611 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.611 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:a0:69:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:01:17 np0005531887 nova_compute[186849]: 2025-11-22 08:01:17.612 186853 INFO nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Using config drive#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.350 186853 INFO nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Creating config drive at /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.355 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjdrxhwe7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.486 186853 DEBUG oslo_concurrency.processutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjdrxhwe7" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:18 np0005531887 kernel: tap8979b5b9-12: entered promiscuous mode
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.5811] manager: (tap8979b5b9-12): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Nov 22 03:01:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:18Z|00271|binding|INFO|Claiming lport 8979b5b9-122d-4f08-832e-80c8509b9f9a for this chassis.
Nov 22 03:01:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:18Z|00272|binding|INFO|8979b5b9-122d-4f08-832e-80c8509b9f9a: Claiming fa:16:3e:a0:69:e9 10.100.0.11
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.582 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.593 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:69:e9 10.100.0.11'], port_security=['fa:16:3e:a0:69:e9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6931849a-3956-46e1-8bb0-462d1b35b82c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=8979b5b9-122d-4f08-832e-80c8509b9f9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.594 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 8979b5b9-122d-4f08-832e-80c8509b9f9a in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.596 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:01:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:18Z|00273|binding|INFO|Setting lport 8979b5b9-122d-4f08-832e-80c8509b9f9a ovn-installed in OVS
Nov 22 03:01:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:18Z|00274|binding|INFO|Setting lport 8979b5b9-122d-4f08-832e-80c8509b9f9a up in Southbound
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.601 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.611 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e0bc74e8-ade9-49da-aa2c-1eaf96afb544]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.612 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a4a282c-d1 in ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:01:18 np0005531887 systemd-udevd[227172]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.614 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a4a282c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.614 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d2ff93-47fc-4ef4-a0b9-21b7ed95d2cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.614 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb6e2d4-a901-4021-bd48-108cced1e2ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 systemd-machined[153180]: New machine qemu-38-instance-0000005f.
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.626 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[121d30b0-0bff-4976-bbc4-d8543369e63d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.6312] device (tap8979b5b9-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:01:18 np0005531887 systemd[1]: Started Virtual Machine qemu-38-instance-0000005f.
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.6331] device (tap8979b5b9-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.652 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7993b2cb-d5e9-4c33-afc2-1b0d2152b12e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.686 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0a766008-66f5-477e-b01b-68913dedb465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.693 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb1005a-68c1-48a1-a6af-16b1938bbb03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 systemd-udevd[227176]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.6944] manager: (tap6a4a282c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.743 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[39d7b16a-3088-45cc-80de-af2b88ab97f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.746 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[87a7a5bd-15dd-4072-bef0-22000dab05d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.7777] device (tap6a4a282c-d0): carrier: link connected
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.787 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3696e44c-b928-4883-8a25-f7ce4aef855d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.807 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[45d1a959-ccb7-4aee-89b4-a4d96b2d58ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 28434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227205, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.828 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[27d0ba4c-c4eb-4308-a185-9626e7090e0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:7a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522138, 'tstamp': 522138}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227206, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.850 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9daa64-475b-4170-9599-6b2246a3bd0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 28434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227207, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.858 186853 DEBUG nova.compute.manager [req-1f9343a4-fc17-4c42-8ba2-b099065a502b req-eb925d8a-0376-49de-82aa-f295d928df05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.859 186853 DEBUG oslo_concurrency.lockutils [req-1f9343a4-fc17-4c42-8ba2-b099065a502b req-eb925d8a-0376-49de-82aa-f295d928df05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.859 186853 DEBUG oslo_concurrency.lockutils [req-1f9343a4-fc17-4c42-8ba2-b099065a502b req-eb925d8a-0376-49de-82aa-f295d928df05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.859 186853 DEBUG oslo_concurrency.lockutils [req-1f9343a4-fc17-4c42-8ba2-b099065a502b req-eb925d8a-0376-49de-82aa-f295d928df05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.860 186853 DEBUG nova.compute.manager [req-1f9343a4-fc17-4c42-8ba2-b099065a502b req-eb925d8a-0376-49de-82aa-f295d928df05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Processing event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.896 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b37dc8eb-b058-4b08-bd4e-91ef6a5e5a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.970 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[69a5ab68-5e7c-4d38-8be6-870c266b68db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.972 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.973 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.973 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.975 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 kernel: tap6a4a282c-d0: entered promiscuous mode
Nov 22 03:01:18 np0005531887 NetworkManager[55210]: <info>  [1763798478.9778] manager: (tap6a4a282c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.978 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.979 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:18Z|00275|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:01:18 np0005531887 nova_compute[186849]: 2025-11-22 08:01:18.991 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.993 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.994 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c44723d1-2474-411c-8a26-3bc01c4994ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.995 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:01:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:18.996 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'env', 'PROCESS_TAG=haproxy-6a4a282c-db22-41de-b34b-2960aa032ca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a4a282c-db22-41de-b34b-2960aa032ca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.409 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.411 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798479.4102144, fb921e88-22d2-4ae6-9d09-970505f0d5bb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.411 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] VM Started (Lifecycle Event)#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.414 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.420 186853 INFO nova.virt.libvirt.driver [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Instance spawned successfully.#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.420 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.434 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:19 np0005531887 podman[227241]: 2025-11-22 08:01:19.342427914 +0000 UTC m=+0.022431187 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.440 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.445 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.446 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.446 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.446 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.447 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.447 186853 DEBUG nova.virt.libvirt.driver [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.475 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.475 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798479.4103916, fb921e88-22d2-4ae6-9d09-970505f0d5bb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.475 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.510 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.516 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798479.41364, fb921e88-22d2-4ae6-9d09-970505f0d5bb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.516 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.523 186853 INFO nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Took 7.63 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.524 186853 DEBUG nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.534 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.540 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.558 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.611 186853 INFO nova.compute.manager [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Took 8.20 seconds to build instance.#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.641 186853 DEBUG oslo_concurrency.lockutils [None req-64f3df8b-3117-422e-b7ee-29b260f9197d 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.701 186853 DEBUG nova.network.neutron [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated VIF entry in instance network info cache for port 8979b5b9-122d-4f08-832e-80c8509b9f9a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.701 186853 DEBUG nova.network.neutron [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:19 np0005531887 nova_compute[186849]: 2025-11-22 08:01:19.716 186853 DEBUG oslo_concurrency.lockutils [req-d1a66e84-4f39-4c7f-84f2-c267cbba5d5b req-4e42a75c-06af-45b4-a755-ec242afb1d2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:20 np0005531887 podman[227241]: 2025-11-22 08:01:20.537080831 +0000 UTC m=+1.217084104 container create 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.978 186853 DEBUG nova.compute.manager [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.978 186853 DEBUG oslo_concurrency.lockutils [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.978 186853 DEBUG oslo_concurrency.lockutils [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.978 186853 DEBUG oslo_concurrency.lockutils [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.979 186853 DEBUG nova.compute.manager [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:01:20 np0005531887 nova_compute[186849]: 2025-11-22 08:01:20.979 186853 WARNING nova.compute.manager [req-e0d5c7c4-6b25-475e-82cb-c048f012456b req-23756d0a-5a63-4ba8-954e-d532671f6d82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a for instance with vm_state active and task_state None.#033[00m
Nov 22 03:01:20 np0005531887 systemd[1]: Started libpod-conmon-642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1.scope.
Nov 22 03:01:21 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:01:21 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aeb595a4e479dbd4fb62d44e84d3b3c0b1131ed8880d6fd6aad99cad5934290/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:01:21 np0005531887 podman[227241]: 2025-11-22 08:01:21.266525858 +0000 UTC m=+1.946529161 container init 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:01:21 np0005531887 podman[227241]: 2025-11-22 08:01:21.272420759 +0000 UTC m=+1.952424032 container start 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:01:21 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [NOTICE]   (227275) : New worker (227277) forked
Nov 22 03:01:21 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [NOTICE]   (227275) : Loading success.
Nov 22 03:01:21 np0005531887 podman[227261]: 2025-11-22 08:01:21.490401202 +0000 UTC m=+0.483926288 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:01:21 np0005531887 nova_compute[186849]: 2025-11-22 08:01:21.668 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:22 np0005531887 nova_compute[186849]: 2025-11-22 08:01:22.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:23 np0005531887 nova_compute[186849]: 2025-11-22 08:01:23.662 186853 DEBUG nova.compute.manager [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-changed-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:01:23 np0005531887 nova_compute[186849]: 2025-11-22 08:01:23.663 186853 DEBUG nova.compute.manager [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing instance network info cache due to event network-changed-8979b5b9-122d-4f08-832e-80c8509b9f9a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:01:23 np0005531887 nova_compute[186849]: 2025-11-22 08:01:23.663 186853 DEBUG oslo_concurrency.lockutils [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:23 np0005531887 nova_compute[186849]: 2025-11-22 08:01:23.663 186853 DEBUG oslo_concurrency.lockutils [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:23 np0005531887 nova_compute[186849]: 2025-11-22 08:01:23.663 186853 DEBUG nova.network.neutron [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing network info cache for port 8979b5b9-122d-4f08-832e-80c8509b9f9a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:01:25 np0005531887 nova_compute[186849]: 2025-11-22 08:01:25.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:26 np0005531887 nova_compute[186849]: 2025-11-22 08:01:26.338 186853 DEBUG nova.network.neutron [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated VIF entry in instance network info cache for port 8979b5b9-122d-4f08-832e-80c8509b9f9a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:01:26 np0005531887 nova_compute[186849]: 2025-11-22 08:01:26.338 186853 DEBUG nova.network.neutron [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:01:26 np0005531887 nova_compute[186849]: 2025-11-22 08:01:26.385 186853 DEBUG oslo_concurrency.lockutils [req-d2e74f60-9841-4253-8d82-a9f1476a54c2 req-4b535b04-961f-497e-996c-0457a85f7c0a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:01:26 np0005531887 nova_compute[186849]: 2025-11-22 08:01:26.670 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:27 np0005531887 nova_compute[186849]: 2025-11-22 08:01:27.375 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:27 np0005531887 podman[227298]: 2025-11-22 08:01:27.835502142 +0000 UTC m=+0.058046050 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:01:31 np0005531887 nova_compute[186849]: 2025-11-22 08:01:31.673 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:32 np0005531887 nova_compute[186849]: 2025-11-22 08:01:32.071 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:32 np0005531887 nova_compute[186849]: 2025-11-22 08:01:32.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:33 np0005531887 podman[227318]: 2025-11-22 08:01:33.829448791 +0000 UTC m=+0.054650714 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:01:36 np0005531887 nova_compute[186849]: 2025-11-22 08:01:36.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:37.336 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:37.337 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:37.338 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:37 np0005531887 nova_compute[186849]: 2025-11-22 08:01:37.380 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:37 np0005531887 podman[227356]: 2025-11-22 08:01:37.841963794 +0000 UTC m=+0.059828117 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:01:41 np0005531887 nova_compute[186849]: 2025-11-22 08:01:41.678 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:42 np0005531887 nova_compute[186849]: 2025-11-22 08:01:42.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:42 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:42Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:69:e9 10.100.0.11
Nov 22 03:01:42 np0005531887 ovn_controller[95130]: 2025-11-22T08:01:42Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:69:e9 10.100.0.11
Nov 22 03:01:42 np0005531887 podman[227382]: 2025-11-22 08:01:42.850678071 +0000 UTC m=+0.062374781 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Nov 22 03:01:46 np0005531887 nova_compute[186849]: 2025-11-22 08:01:46.680 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:47 np0005531887 nova_compute[186849]: 2025-11-22 08:01:47.387 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:47 np0005531887 podman[227405]: 2025-11-22 08:01:47.843108571 +0000 UTC m=+0.062975647 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:01:47 np0005531887 podman[227406]: 2025-11-22 08:01:47.894156181 +0000 UTC m=+0.110196538 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:01:48 np0005531887 nova_compute[186849]: 2025-11-22 08:01:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:51 np0005531887 nova_compute[186849]: 2025-11-22 08:01:51.682 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:51 np0005531887 podman[227446]: 2025-11-22 08:01:51.82506748 +0000 UTC m=+0.047832758 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.867 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.936 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:52 np0005531887 nova_compute[186849]: 2025-11-22 08:01:52.937 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.001 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.180 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.182 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5595MB free_disk=73.31669998168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.182 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.182 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.276 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance fb921e88-22d2-4ae6-9d09-970505f0d5bb actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.277 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.277 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.323 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.342 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.378 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:01:53 np0005531887 nova_compute[186849]: 2025-11-22 08:01:53.379 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:01:54 np0005531887 nova_compute[186849]: 2025-11-22 08:01:54.380 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:54 np0005531887 nova_compute[186849]: 2025-11-22 08:01:54.381 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:01:54 np0005531887 nova_compute[186849]: 2025-11-22 08:01:54.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:55 np0005531887 nova_compute[186849]: 2025-11-22 08:01:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:56 np0005531887 nova_compute[186849]: 2025-11-22 08:01:56.683 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:56 np0005531887 nova_compute[186849]: 2025-11-22 08:01:56.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.393 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:57.765 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:01:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:01:57.767 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.875 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.875 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.876 186853 DEBUG nova.objects.instance [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.900 186853 DEBUG nova.objects.instance [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:57 np0005531887 nova_compute[186849]: 2025-11-22 08:01:57.920 186853 DEBUG nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.599 186853 DEBUG nova.policy [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:01:58 np0005531887 podman[227477]: 2025-11-22 08:01:58.844398131 +0000 UTC m=+0.065467251 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.978 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.978 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.978 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:01:58 np0005531887 nova_compute[186849]: 2025-11-22 08:01:58.978 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:01:59 np0005531887 nova_compute[186849]: 2025-11-22 08:01:59.722 186853 DEBUG nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully created port: a5480b52-2a9f-4662-8b66-bd078a80ca44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.052 186853 DEBUG nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully updated port: a5480b52-2a9f-4662-8b66-bd078a80ca44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.083 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.686 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.746 186853 DEBUG nova.compute.manager [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-changed-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.746 186853 DEBUG nova.compute.manager [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing instance network info cache due to event network-changed-a5480b52-2a9f-4662-8b66-bd078a80ca44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:02:01 np0005531887 nova_compute[186849]: 2025-11-22 08:02:01.746 186853 DEBUG oslo_concurrency.lockutils [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.227 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": null, "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapa5480b52-2a", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.248 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.249 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.250 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.250 186853 DEBUG nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.651 186853 WARNING nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.651 186853 WARNING nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:02 np0005531887 nova_compute[186849]: 2025-11-22 08:02:02.652 186853 WARNING nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] a5480b52-2a9f-4662-8b66-bd078a80ca44 already exists in list: port_ids containing: ['a5480b52-2a9f-4662-8b66-bd078a80ca44']. ignoring it#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.122 186853 DEBUG nova.network.neutron [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.139 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.140 186853 DEBUG oslo_concurrency.lockutils [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.141 186853 DEBUG nova.network.neutron [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing network info cache for port a5480b52-2a9f-4662-8b66-bd078a80ca44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.144 186853 DEBUG nova.virt.libvirt.vif [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.145 186853 DEBUG nova.network.os_vif_util [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.145 186853 DEBUG nova.network.os_vif_util [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.147 186853 DEBUG os_vif [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.149 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.150 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.157 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.158 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5480b52-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.159 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5480b52-2a, col_values=(('external_ids', {'iface-id': 'a5480b52-2a9f-4662-8b66-bd078a80ca44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:20:36', 'vm-uuid': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.161 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 NetworkManager[55210]: <info>  [1763798524.1620] manager: (tapa5480b52-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.164 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.168 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.169 186853 INFO os_vif [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a')#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.170 186853 DEBUG nova.virt.libvirt.vif [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.171 186853 DEBUG nova.network.os_vif_util [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.171 186853 DEBUG nova.network.os_vif_util [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.173 186853 DEBUG nova.virt.libvirt.guest [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:5f:20:36"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <target dev="tapa5480b52-2a"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:02:04 np0005531887 nova_compute[186849]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:02:04 np0005531887 kernel: tapa5480b52-2a: entered promiscuous mode
Nov 22 03:02:04 np0005531887 NetworkManager[55210]: <info>  [1763798524.1852] manager: (tapa5480b52-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Nov 22 03:02:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:04Z|00276|binding|INFO|Claiming lport a5480b52-2a9f-4662-8b66-bd078a80ca44 for this chassis.
Nov 22 03:02:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:04Z|00277|binding|INFO|a5480b52-2a9f-4662-8b66-bd078a80ca44: Claiming fa:16:3e:5f:20:36 10.100.0.10
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.187 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.196 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:20:36 10.100.0.10'], port_security=['fa:16:3e:5f:20:36 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a5480b52-2a9f-4662-8b66-bd078a80ca44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.197 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a5480b52-2a9f-4662-8b66-bd078a80ca44 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:02:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:04Z|00278|binding|INFO|Setting lport a5480b52-2a9f-4662-8b66-bd078a80ca44 ovn-installed in OVS
Nov 22 03:02:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:04Z|00279|binding|INFO|Setting lport a5480b52-2a9f-4662-8b66-bd078a80ca44 up in Southbound
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.199 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.200 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.210 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.214 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2b48634b-2ce6-4058-bfec-ffa15218eea5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 systemd-udevd[227513]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.245 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5b46d8-e99a-42c4-b0f9-f4ddcb1cbacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.248 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[52c40147-2091-4bcf-a758-d1386d0ad14b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 NetworkManager[55210]: <info>  [1763798524.2523] device (tapa5480b52-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:02:04 np0005531887 NetworkManager[55210]: <info>  [1763798524.2559] device (tapa5480b52-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.283 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d81b34-38ab-40f5-afa0-92600fb35829]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.306 186853 DEBUG nova.virt.libvirt.driver [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.307 186853 DEBUG nova.virt.libvirt.driver [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.307 186853 DEBUG nova.virt.libvirt.driver [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:a0:69:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.307 186853 DEBUG nova.virt.libvirt.driver [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:5f:20:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.312 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5146bdc7-cab7-4d45-811f-ad21023473f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227529, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.325 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f95e42f6-8c3a-45e1-801b-1ce305b73a8e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227530, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227530, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.326 186853 DEBUG nova.virt.libvirt.guest [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:04</nova:creationTime>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:04 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    <nova:port uuid="a5480b52-2a9f-4662-8b66-bd078a80ca44">
Nov 22 03:02:04 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:04 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:04 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:04 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.327 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 podman[227501]: 2025-11-22 08:02:04.329563414 +0000 UTC m=+0.111964404 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.331 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.331 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.331 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.331 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:04.332 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.353 186853 DEBUG oslo_concurrency.lockutils [None req-4e1cf450-bb43-400e-bfee-25dd3d47b55a 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.399 186853 DEBUG nova.compute.manager [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.399 186853 DEBUG oslo_concurrency.lockutils [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.400 186853 DEBUG oslo_concurrency.lockutils [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.400 186853 DEBUG oslo_concurrency.lockutils [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.400 186853 DEBUG nova.compute.manager [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:04 np0005531887 nova_compute[186849]: 2025-11-22 08:02:04.400 186853 WARNING nova.compute.manager [req-5a0a64ab-cb11-451a-a3ce-2db3d8d81bda req-70bcee9a-0415-43b2-a8f3-c2e09de086fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.141 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.141 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.141 186853 DEBUG nova.objects.instance [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.578 186853 DEBUG nova.network.neutron [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated VIF entry in instance network info cache for port a5480b52-2a9f-4662-8b66-bd078a80ca44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.579 186853 DEBUG nova.network.neutron [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:05 np0005531887 nova_compute[186849]: 2025-11-22 08:02:05.598 186853 DEBUG oslo_concurrency.lockutils [req-039f0b16-c3a5-4e4b-b9c1-b2163a1beb0c req-c6564ad9-3148-4c2d-acd8-7029e2512296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:06Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:20:36 10.100.0.10
Nov 22 03:02:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:06Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:20:36 10.100.0.10
Nov 22 03:02:06 np0005531887 nova_compute[186849]: 2025-11-22 08:02:06.244 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:06 np0005531887 nova_compute[186849]: 2025-11-22 08:02:06.650 186853 DEBUG nova.objects.instance [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:06 np0005531887 nova_compute[186849]: 2025-11-22 08:02:06.661 186853 DEBUG nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:02:06 np0005531887 nova_compute[186849]: 2025-11-22 08:02:06.688 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.684 186853 DEBUG nova.policy [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.747 186853 DEBUG nova.compute.manager [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.747 186853 DEBUG oslo_concurrency.lockutils [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.747 186853 DEBUG oslo_concurrency.lockutils [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.748 186853 DEBUG oslo_concurrency.lockutils [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.748 186853 DEBUG nova.compute.manager [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:07 np0005531887 nova_compute[186849]: 2025-11-22 08:02:07.748 186853 WARNING nova.compute.manager [req-c48c9793-29cd-446e-8c85-189388d2ea61 req-c5086de2-b7b4-4f48-9c05-f49ad0647c3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:07.768 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:08 np0005531887 podman[227531]: 2025-11-22 08:02:08.849683083 +0000 UTC m=+0.063318125 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:02:09 np0005531887 nova_compute[186849]: 2025-11-22 08:02:09.164 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:11 np0005531887 nova_compute[186849]: 2025-11-22 08:02:11.297 186853 DEBUG nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully created port: 26960794-1ab2-46fd-917b-8b5e28186dc3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:02:11 np0005531887 nova_compute[186849]: 2025-11-22 08:02:11.691 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.056 186853 DEBUG nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully updated port: 26960794-1ab2-46fd-917b-8b5e28186dc3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.079 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.080 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.081 186853 DEBUG nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.252 186853 WARNING nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.253 186853 WARNING nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.820 186853 DEBUG nova.compute.manager [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-changed-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.821 186853 DEBUG nova.compute.manager [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing instance network info cache due to event network-changed-26960794-1ab2-46fd-917b-8b5e28186dc3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:02:13 np0005531887 nova_compute[186849]: 2025-11-22 08:02:13.821 186853 DEBUG oslo_concurrency.lockutils [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:13 np0005531887 podman[227556]: 2025-11-22 08:02:13.832224388 +0000 UTC m=+0.055196828 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:02:14 np0005531887 nova_compute[186849]: 2025-11-22 08:02:14.168 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:16 np0005531887 nova_compute[186849]: 2025-11-22 08:02:16.693 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.876 186853 DEBUG nova.network.neutron [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.967 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.970 186853 DEBUG oslo_concurrency.lockutils [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.971 186853 DEBUG nova.network.neutron [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing network info cache for port 26960794-1ab2-46fd-917b-8b5e28186dc3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.974 186853 DEBUG nova.virt.libvirt.vif [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.974 186853 DEBUG nova.network.os_vif_util [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.975 186853 DEBUG nova.network.os_vif_util [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.975 186853 DEBUG os_vif [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.976 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.976 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.977 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.980 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.980 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26960794-1a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.981 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26960794-1a, col_values=(('external_ids', {'iface-id': '26960794-1ab2-46fd-917b-8b5e28186dc3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:75:a1:57', 'vm-uuid': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:17 np0005531887 NetworkManager[55210]: <info>  [1763798537.9850] manager: (tap26960794-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.988 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.991 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.993 186853 INFO os_vif [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a')#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.995 186853 DEBUG nova.virt.libvirt.vif [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.995 186853 DEBUG nova.network.os_vif_util [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.996 186853 DEBUG nova.network.os_vif_util [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:17 np0005531887 nova_compute[186849]: 2025-11-22 08:02:17.999 186853 DEBUG nova.virt.libvirt.guest [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:02:17 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:75:a1:57"/>
Nov 22 03:02:17 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:02:17 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:02:17 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:02:17 np0005531887 nova_compute[186849]:  <target dev="tap26960794-1a"/>
Nov 22 03:02:17 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:02:17 np0005531887 nova_compute[186849]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:02:18 np0005531887 kernel: tap26960794-1a: entered promiscuous mode
Nov 22 03:02:18 np0005531887 NetworkManager[55210]: <info>  [1763798538.0100] manager: (tap26960794-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.011 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:18Z|00280|binding|INFO|Claiming lport 26960794-1ab2-46fd-917b-8b5e28186dc3 for this chassis.
Nov 22 03:02:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:18Z|00281|binding|INFO|26960794-1ab2-46fd-917b-8b5e28186dc3: Claiming fa:16:3e:75:a1:57 10.100.0.7
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.027 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:a1:57 10.100.0.7'], port_security=['fa:16:3e:75:a1:57 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=26960794-1ab2-46fd-917b-8b5e28186dc3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.029 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:18Z|00282|binding|INFO|Setting lport 26960794-1ab2-46fd-917b-8b5e28186dc3 ovn-installed in OVS
Nov 22 03:02:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:18Z|00283|binding|INFO|Setting lport 26960794-1ab2-46fd-917b-8b5e28186dc3 up in Southbound
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.028 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 26960794-1ab2-46fd-917b-8b5e28186dc3 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.030 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.035 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.047 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[64a6977b-970b-4e00-bbce-ad57b765196c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 systemd-udevd[227599]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:02:18 np0005531887 NetworkManager[55210]: <info>  [1763798538.0763] device (tap26960794-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:02:18 np0005531887 NetworkManager[55210]: <info>  [1763798538.0771] device (tap26960794-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.077 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6afc3c2b-7965-42cb-a65a-69ff345a45f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.080 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[aedb1aa0-19e7-45a7-8c17-17b8ad7ae658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.115 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bd5050-501f-4087-b13c-ed21b1cc0f0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 podman[227584]: 2025-11-22 08:02:18.126877971 +0000 UTC m=+0.081196515 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.137 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cf64f0af-15c7-4afd-b146-1bb4dbe4ebc0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227631, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.155 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6268d84f-9d63-4af2-8cc1-978e068fb698]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227635, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227635, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.156 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.158 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.159 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.159 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.160 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:18.160 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:18 np0005531887 podman[227585]: 2025-11-22 08:02:18.162741112 +0000 UTC m=+0.112309844 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.167 186853 DEBUG nova.virt.libvirt.driver [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.167 186853 DEBUG nova.virt.libvirt.driver [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.168 186853 DEBUG nova.virt.libvirt.driver [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:a0:69:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.168 186853 DEBUG nova.virt.libvirt.driver [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:5f:20:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.168 186853 DEBUG nova.virt.libvirt.driver [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:75:a1:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.198 186853 DEBUG nova.virt.libvirt.guest [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:18</nova:creationTime>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:18 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:port uuid="a5480b52-2a9f-4662-8b66-bd078a80ca44">
Nov 22 03:02:18 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:18 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:18 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:18 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:18 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.224 186853 DEBUG oslo_concurrency.lockutils [None req-456b3d6d-4d7e-4d21-91a5-8dd48c74b352 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 13.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.883 186853 DEBUG nova.compute.manager [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.884 186853 DEBUG oslo_concurrency.lockutils [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.884 186853 DEBUG oslo_concurrency.lockutils [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.884 186853 DEBUG oslo_concurrency.lockutils [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.884 186853 DEBUG nova.compute.manager [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:18 np0005531887 nova_compute[186849]: 2025-11-22 08:02:18.885 186853 WARNING nova.compute.manager [req-fa0e0dbb-673e-4fa6-b2dc-abcd6f5a7e55 req-00cb75e1-174c-4df6-a943-8ba9582b15b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.034 186853 DEBUG nova.network.neutron [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated VIF entry in instance network info cache for port 26960794-1ab2-46fd-917b-8b5e28186dc3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.035 186853 DEBUG nova.network.neutron [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.060 186853 DEBUG oslo_concurrency.lockutils [req-12e8498e-0f39-493a-8529-92a73fb07b56 req-ec1137df-f963-4276-840b-03ada302ef82 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:20Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:75:a1:57 10.100.0.7
Nov 22 03:02:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:20Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:75:a1:57 10.100.0.7
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.954 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.955 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:20 np0005531887 nova_compute[186849]: 2025-11-22 08:02:20.955 186853 DEBUG nova.objects.instance [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.695 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.716 186853 DEBUG nova.compute.manager [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.716 186853 DEBUG oslo_concurrency.lockutils [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.717 186853 DEBUG oslo_concurrency.lockutils [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.717 186853 DEBUG oslo_concurrency.lockutils [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.717 186853 DEBUG nova.compute.manager [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.717 186853 WARNING nova.compute.manager [req-7f5899e4-ef6e-4980-a23f-5f3bac8caf2a req-6f225c76-02ee-44e8-a4bc-6d3d53e0f142 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.922 186853 DEBUG nova.objects.instance [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:21 np0005531887 nova_compute[186849]: 2025-11-22 08:02:21.946 186853 DEBUG nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:02:22 np0005531887 nova_compute[186849]: 2025-11-22 08:02:22.649 186853 DEBUG nova.policy [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:02:22 np0005531887 podman[227639]: 2025-11-22 08:02:22.835013656 +0000 UTC m=+0.055897465 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:02:22 np0005531887 nova_compute[186849]: 2025-11-22 08:02:22.984 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.502 186853 DEBUG nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Successfully updated port: b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.517 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.518 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.518 186853 DEBUG nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.618 186853 DEBUG nova.compute.manager [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-changed-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.619 186853 DEBUG nova.compute.manager [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing instance network info cache due to event network-changed-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.619 186853 DEBUG oslo_concurrency.lockutils [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.701 186853 WARNING nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.702 186853 WARNING nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:23 np0005531887 nova_compute[186849]: 2025-11-22 08:02:23.702 186853 WARNING nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:02:26 np0005531887 nova_compute[186849]: 2025-11-22 08:02:26.698 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:27 np0005531887 nova_compute[186849]: 2025-11-22 08:02:27.990 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:29 np0005531887 podman[227661]: 2025-11-22 08:02:29.872230557 +0000 UTC m=+0.087742473 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:02:31 np0005531887 nova_compute[186849]: 2025-11-22 08:02:31.703 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:32 np0005531887 nova_compute[186849]: 2025-11-22 08:02:32.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.724 186853 DEBUG nova.network.neutron [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.886 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.888 186853 DEBUG oslo_concurrency.lockutils [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.888 186853 DEBUG nova.network.neutron [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Refreshing network info cache for port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.892 186853 DEBUG nova.virt.libvirt.vif [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.893 186853 DEBUG nova.network.os_vif_util [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.894 186853 DEBUG nova.network.os_vif_util [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.894 186853 DEBUG os_vif [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.895 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.895 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.898 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7ba05ee-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.899 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7ba05ee-84, col_values=(('external_ids', {'iface-id': 'b7ba05ee-8492-4d3b-ae6b-fd90eee67f56', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:85:53', 'vm-uuid': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.900 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.902 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:33 np0005531887 NetworkManager[55210]: <info>  [1763798553.9032] manager: (tapb7ba05ee-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.910 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.912 186853 INFO os_vif [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84')#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.913 186853 DEBUG nova.virt.libvirt.vif [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.914 186853 DEBUG nova.network.os_vif_util [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.914 186853 DEBUG nova.network.os_vif_util [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.919 186853 DEBUG nova.virt.libvirt.guest [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:02:33 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:ff:85:53"/>
Nov 22 03:02:33 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:02:33 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:02:33 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:02:33 np0005531887 nova_compute[186849]:  <target dev="tapb7ba05ee-84"/>
Nov 22 03:02:33 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:02:33 np0005531887 nova_compute[186849]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:02:33 np0005531887 kernel: tapb7ba05ee-84: entered promiscuous mode
Nov 22 03:02:33 np0005531887 NetworkManager[55210]: <info>  [1763798553.9350] manager: (tapb7ba05ee-84): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.936 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:33Z|00284|binding|INFO|Claiming lport b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 for this chassis.
Nov 22 03:02:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:33Z|00285|binding|INFO|b7ba05ee-8492-4d3b-ae6b-fd90eee67f56: Claiming fa:16:3e:ff:85:53 10.100.0.13
Nov 22 03:02:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:33Z|00286|binding|INFO|Setting lport b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 ovn-installed in OVS
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.953 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 nova_compute[186849]: 2025-11-22 08:02:33.957 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:33 np0005531887 systemd-udevd[227688]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:02:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:33Z|00287|binding|INFO|Setting lport b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 up in Southbound
Nov 22 03:02:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:33.986 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:85:53 10.100.0.13'], port_security=['fa:16:3e:ff:85:53 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-728286560', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-728286560', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:33.987 104084 INFO neutron.agent.ovn.metadata.agent [-] Port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:02:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:33.989 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:33 np0005531887 NetworkManager[55210]: <info>  [1763798553.9936] device (tapb7ba05ee-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:02:33 np0005531887 NetworkManager[55210]: <info>  [1763798553.9948] device (tapb7ba05ee-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.015 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed94c3d-98aa-4ce0-8fdb-79fece08264e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.050 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[270d9cab-5692-415c-b113-d3fa2965e1ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.054 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0b941e-0e46-47fa-821f-d057f991d572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.067 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.067 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.068 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:a0:69:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.068 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:5f:20:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.068 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:75:a1:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.068 186853 DEBUG nova.virt.libvirt.driver [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:ff:85:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.089 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[354506d5-8e38-4501-83ba-f6841dfcb93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.109 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3d6950-b660-4dd2-86f8-361fe6a3798e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227696, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.113 186853 DEBUG nova.virt.libvirt.guest [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:34</nova:creationTime>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:port uuid="a5480b52-2a9f-4662-8b66-bd078a80ca44">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:34 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:34 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:34 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:34 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.131 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[693f4f34-f8bb-4536-abd1-9f0f3821fd9b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227697, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227697, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.133 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.136 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.136 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.137 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:34.137 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.140 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.143 186853 DEBUG oslo_concurrency.lockutils [None req-4360d31d-bf57-4507-8fb9-b6c804eb90f3 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 13.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.828 186853 DEBUG nova.compute.manager [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.829 186853 DEBUG oslo_concurrency.lockutils [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.830 186853 DEBUG oslo_concurrency.lockutils [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.830 186853 DEBUG oslo_concurrency.lockutils [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.830 186853 DEBUG nova.compute.manager [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:34 np0005531887 nova_compute[186849]: 2025-11-22 08:02:34.830 186853 WARNING nova.compute.manager [req-83134628-7118-4641-aebc-7b2c0aeb0d5f req-efc5cf4f-2db2-4b16-a6b7-699847e0fd5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:34 np0005531887 podman[227698]: 2025-11-22 08:02:34.850292489 +0000 UTC m=+0.066152188 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 22 03:02:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:35Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:85:53 10.100.0.13
Nov 22 03:02:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:35Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:85:53 10.100.0.13
Nov 22 03:02:35 np0005531887 nova_compute[186849]: 2025-11-22 08:02:35.945 186853 DEBUG nova.network.neutron [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updated VIF entry in instance network info cache for port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:02:35 np0005531887 nova_compute[186849]: 2025-11-22 08:02:35.946 186853 DEBUG nova.network.neutron [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:35 np0005531887 nova_compute[186849]: 2025-11-22 08:02:35.962 186853 DEBUG oslo_concurrency.lockutils [req-a28f81d5-c04f-418e-aad8-700a1a8739b1 req-5b5a4b8a-2cd4-40bf-99cc-f28abfd3d8a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.668 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'hostId': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.684 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.685 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da2d3f0-674f-43b8-98d9-06c6679602a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.671242', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc554da-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': '16d4ce7a5cd7b66b7cd34ae20e4a4d43dd5488efdbd022bbfa9a213124996615'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 
'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.671242', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc56330-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': '6b032330d9b803844385d0bf2677bff9245e618d08881e8f416c8d8d18fe95cf'}]}, 'timestamp': '2025-11-22 08:02:36.686163', '_unique_id': '3eba801b8794468aa18b04fe65d9501e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.688 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.688 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c040a645-b402-40a2-a6d6-be9aa0889096', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.688343', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc5c3c0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': 'fceda48646f59ae11e02dd16be514cea15d1736ed6edf81fa56469b21b709086'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 
'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.688343', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc5cd48-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': '629d1600d722a8434e022d7da0e8498440aa000399e6b25f01bfbce852f20bb7'}]}, 'timestamp': '2025-11-22 08:02:36.688824', '_unique_id': '7c3da8311f554f3bae37557391018d7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.692 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fb921e88-22d2-4ae6-9d09-970505f0d5bb / tap8979b5b9-12 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.693 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fb921e88-22d2-4ae6-9d09-970505f0d5bb / tapa5480b52-2a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.693 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fb921e88-22d2-4ae6-9d09-970505f0d5bb / tap26960794-1a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.694 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fb921e88-22d2-4ae6-9d09-970505f0d5bb / tapb7ba05ee-84 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.694 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.694 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.695 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.695 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0aecf7-d55c-4b35-b4b6-0f4fcf61c655', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.690075', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bc6b898-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'fa7bc2fda5a5e2fad8d8d9334a45443db10ee757a423ea098d6e69864a66e028'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.690075', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bc6c6da-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '792123a4ab4f02c6955c9bc1a7ccae04064299c4140a42ba4162c437ba52ee5f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.690075', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bc6d0b2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '934f1d644e4239d8bc46f1ce499cc8b6e078b77d78c17d93f0cdc685a848e4a9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.690075', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bc6dada-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '4773b1ef1a13341a77ce6f571d61bebe0e7e4f0761316dae7094d890aeff365e'}]}, 'timestamp': '2025-11-22 08:02:36.695756', '_unique_id': 'f181b6f201184b5399d3460d5e35b0a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:02:36 np0005531887 nova_compute[186849]: 2025-11-22 08:02:36.705 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.724 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.requests volume: 329 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.725 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b292c1c-18fc-4c47-b6fa-326b85b7d5cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 329, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.697869', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bcb4fc0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '7d3de8ad5f6172d149a176892174c089a3f76ba4987ee52c04a1b07e4fc160c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 
'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.697869', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bcb5bc8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '0d51404b231aa78ef89416208fb46e02322d4e17ff7a47ea9e4c99dc6808adc7'}]}, 'timestamp': '2025-11-22 08:02:36.725288', '_unique_id': '110d4c4ed04a4fb1974cf61fdca92356'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.726 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.742 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/memory.usage volume: 45.125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eeeb2ecc-6a1b-4084-aab0-2806e9496aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 45.125, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'timestamp': '2025-11-22T08:02:36.726989', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9bce08be-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.411981925, 'message_signature': 'd3c173fd9e1a3abae06537516ba118ac4b177d55c1fa9e22d54b83a2644b9924'}]}, 'timestamp': '2025-11-22 08:02:36.742883', '_unique_id': 'c11120d3d4354722b2c3223044bc7e5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.744 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.744 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.745 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.745 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes volume: 1152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08740d91-6ef6-49fd-a855-0070799abbd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.744668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bce5d6e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '412723f1c53deb82d5663848b5d8818a0ffff3880a2bbe477817ba9217894db1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.744668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bce67be-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '84a159400c153fb4422e03b69d8e70ae42e6ec4ea6aa51a928f49cf4d23d46bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.744668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bce722c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '791a2292eea3c9d32927f606844d73fbff094a65eccefd2c9379c1a651f7040d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1152, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.744668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bce7a9c-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'adfeb61685d47bc0df76b30f07befd22b620ff9d3971fd6e18c8aa848d51bd1a'}]}, 'timestamp': '2025-11-22 08:02:36.745723', '_unique_id': '1dd524bac2c54caa9cdd8877d014ebea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.747 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.bytes volume: 73080832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.747 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac454ce-6be1-4d1a-a14a-a3a610da3d16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73080832, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.747403', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bcec826-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '59356c7fe590a596835464f23f026de3fbad878107423755e4c954fe32604b11'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 
'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.747403', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bced3ac-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '7b820a2034b6963e6009f48401eac74473e9f42635a83488f6041fd524d724e8'}]}, 'timestamp': '2025-11-22 08:02:36.748018', '_unique_id': '941d4c69d88c43309babc9d617fd449a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.749 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.749 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.749 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0833eab2-c88b-48f4-8100-e1cda30dff20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.749285', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bcf101a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '05b656338906c3a87d44293df486b293dfe1c163e64e1d5ee47190fdb640184f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.749285', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bcf1a6a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '877a4e05b012af4e1d2408fe5e1b80286e7db82ef7be57cb8cc0fa7f45e31ed1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.749285', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bcf242e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '48a013ae89a917e58a6f8f05177920e903227400131f798c3b1b604479f2b68b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.749285', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bcf2ece-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'a5c29f531f83cd26c6e2dec136302b89153d91f5991f44492a758c6c4402956c'}]}, 'timestamp': '2025-11-22 08:02:36.750338', '_unique_id': '38015c4ccd6e41b889053b2049f8810d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.751 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes volume: 4689 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.751 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes volume: 1598 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.752 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.752 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9b71d8c-17ad-4c63-ad06-53426d1570c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4689, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.751643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bcf6c36-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '62b5813d1cf3f9d7ae624d2b8b09f522de9af8c2fc35d63e2e7c14c07c09a6d7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1598, 'user_id': 
'53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.751643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bcf74a6-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '1c0859c1d35967c04eb9cd89007a0f081ed52f6580507ed75b88a9474e482aab'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.751643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bcf7fdc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '51f793d7d2cd6bed5274356231c7ae862e8479a71605274eaf0fa9e577b1865a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.751643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bcf8a0e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'd0edd92297d999d221e32d7c1fb12b4866ebd7fe205415f6596497894b9c44aa'}]}, 'timestamp': '2025-11-22 08:02:36.752675', '_unique_id': '16444e0516354ca3b919dcffe4a4edb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.754 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.754 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.754 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.754 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdb2010b-bed2-483c-8264-87916728e69c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.754057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bcfca46-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '1e5984bf973152999b278d50a0ca47f024b17f754376385683d891ed5cf2ade2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.754057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bcfd374-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'cebc4026a0c60d6894347ab855312d0ba9f49f05e51e72daf114c6418da59ed3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.754057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bcfdb58-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'e0bc99e712da612b56c5c2a2f7606851259599339120bd4ac504a0e578f8cfb4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.754057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bcfe382-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '5b9caf78994c415090bf37231120fdb6741f1e7426f81a3d11e7dc3c1ea45cb7'}]}, 'timestamp': '2025-11-22 08:02:36.754929', '_unique_id': '9637f69f0717451e90b3ed728a25298a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.756 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.756 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c08a7684-7b91-4c41-b82d-38d2235d6ad1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.756241', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bd02252-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': 'cc5c68a167f76ca18c1cfbe9b1a50a09abba0e4a753f5f3bb0719c780de5f855'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 
'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.756241', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bd02ce8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.341079596, 'message_signature': '2f70843a1ac41af534fc27fae2958edc4a793d09bd6cb9275afd9a954fae60b7'}]}, 'timestamp': '2025-11-22 08:02:36.756835', '_unique_id': '9e42a3b912694fe38b854109d516394b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.758 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.758 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.758 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.759 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '949d692e-e429-4312-90ee-47ec3bbc247d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.758164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bd06ce4-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '2526a2a1181f5b229d4ab2a9c8a1a8ef13f2763cf9a8dd5c2da96b78ace916bb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.758164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bd0787e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '5ac4a806875a4e8e98c3ff35c6ab43ae9a274e41cc02868877b747fbfba2a213'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.758164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bd083be-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '674b7c5ca4ce3d4a597ac09cc97b5208f9ad34bc6b702d0cd1a280c3d95ddedf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.758164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bd08f4e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '7e39d4148a61046b12a57fb9a0e91bc0e6775cc00b0dd1dcdc2bb5150434be0b'}]}, 'timestamp': '2025-11-22 08:02:36.759399', '_unique_id': 'e05c1fea82cd4086a00267ef06fcf524'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.761 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.requests volume: 1114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.761 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e7473c1-3ac8-49b2-94b1-d8b0fb9e377d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1114, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.761504', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bd0effc-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': 'd3f690e66bc6e2aeec6fe7931ddbd1a2dc0968bad14be0cdffdf39737a6d1ac8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 
'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.761504', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bd0fd44-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': 'a2a4c5f070b5ebfd4ddda9ac4cfbcbf9a4c7c4dd2bba397d0fb0f85921eb50c3'}]}, 'timestamp': '2025-11-22 08:02:36.762196', '_unique_id': '5b687620c4984c5593d0c9306e8e1692'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.764 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.764 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>]
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.764 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.765 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.765 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.765 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d9b1c1b-bcfa-4ddb-a471-e2772be96ad7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.764776', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bd16f18-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '3dd7061c45c8b04a7b757e0eb1c65114afd88485b5b135408c5b0a68bdd7f00b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.764776', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bd17896-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '564f7354a1e715e411b9e7de0b3d37a100378612b97984390d404bb10f4b9fd1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.764776', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bd1821e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'cd75fc91e2ec0bb90c486437a979ba000e6a1fff8f8c39bb4242b9b6feb1362c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.764776', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bd18ac0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '406b0a7dc206754855783d77f10ea4262c8514469a88a1a2b9a555caa1e93d92'}]}, 'timestamp': '2025-11-22 08:02:36.765779', '_unique_id': '4a29796bcf2c4ed28d5fdf906701120b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.767 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets volume: 37 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.767 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.767 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.768 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71c16a51-3634-4636-9dff-e48dce5885b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 37, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.767286', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bd1d07a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'fc37ed7956ba589ae371ce635a40f7eb4f92f3e9748247cb9da9816c0027e809'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.767286', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bd1db74-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'b7413b14c33d6529a55bce173621b133dab29fc983174a7ac3d2cc86ff7001dd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.767286', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bd1e4de-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'd329441cbd6cd9d8af199ee69e57f3e2a2b513490a6e454544ab4c30983cc2b9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.767286', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bd1f154-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '61bb5563720f073c791e0cabc929c76db58ca1f81dc7ff436d28b65361f0de5a'}]}, 'timestamp': '2025-11-22 08:02:36.768408', '_unique_id': '7f656824f48a4773a4086ab5a9347140'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.769 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.latency volume: 2624790906 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.latency volume: 690665944 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fa2e4b4-47e9-4900-a9c2-8afc74924089', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2624790906, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.769737', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bd23038-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': 'b613ea2919ea28f049fe7a37a4ce104ab531a6ef60889aa18d1856274a8bd21a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 690665944, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 
'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.769737', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bd23ad8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '48e3f06f97c5694b98283d7053a89cc243b7b92018fd426bb0e73a8ac4ca90ec'}]}, 'timestamp': '2025-11-22 08:02:36.770313', '_unique_id': 'ce8f960eb20248b5ad11a98045942660'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.771 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.771 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.772 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.772 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9bb8c09-d9d6-4eac-9bfe-c856a19d6c53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.771570', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bd277a0-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'e9b39c57922b6d637802f5c51a260daf8e0fda42ec0692f0720b7a1ffea082b0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.771570', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bd283a8-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'eb3df1508296dab69e02e6b54bd4092f72d2e159540214dc10f6dea302e949b2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.771570', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bd28e34-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '7360f1a8ab43d4b1eb3d62d4bd0d1f6f8b44c8a78455875f8a9100d5e224b0e9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.771570', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bd29730-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'bfd4824adc1d17043e18d2a8319e85469dd894af0c9d2704deb1b88c86d7df4e'}]}, 'timestamp': '2025-11-22 08:02:36.772672', '_unique_id': 'dada056adf604e28827e9b80600c4cd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.774 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.774 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/cpu volume: 15130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5700dadf-65d7-40b3-b618-5b1a60bb5f2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15130000000, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'timestamp': '2025-11-22T08:02:36.774230', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9bd2e118-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.411981925, 'message_signature': 'eefa4b8d6c61a2c6105b5179b2e7fbad18418503563271d6c75a75732e1841c0'}]}, 'timestamp': '2025-11-22 08:02:36.774581', '_unique_id': '4fe09e9b06a049e592e1a5a8e634ea0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.775 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.776 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>]
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.776 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.bytes volume: 30190080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.776 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34d52df2-4188-4d28-8ca5-a085f8ff1db5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30190080, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.776404', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bd33528-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '7a0a1df27087070c963465b78c5156bf610dbba78e609ab0dc833806a4f25cab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 
'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.776404', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bd3413a-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '24dae922963b54e78c9c07da4efbfc82e2b848d70925043076287b712ab676f8'}]}, 'timestamp': '2025-11-22 08:02:36.777035', '_unique_id': 'cee09e362ffa4f11ac5851da5eb5b1ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.778 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.latency volume: 267148408858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.778 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8165cd34-e5d8-48f6-8765-b7711165e5fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 267148408858, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-vda', 'timestamp': '2025-11-22T08:02:36.778624', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bd38b5e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': 'a9aa482fdce72fe7821c1d4602695b93a356746af93cf239254f5baa5accdc50'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': 
None, 'resource_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb-sda', 'timestamp': '2025-11-22T08:02:36.778624', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'instance-0000005f', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bd3987e-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.367695309, 'message_signature': '76b51736a90d44f7a6c8ecbcfcc52fe8ef92001a0f2678c8609c02c465bdb372'}]}, 'timestamp': '2025-11-22 08:02:36.779286', '_unique_id': 'a258449365ae493597352d224533395f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.780 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.781 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>]
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.781 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.781 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.781 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.782 12 DEBUG ceilometer.compute.pollsters [-] fb921e88-22d2-4ae6-9d09-970505f0d5bb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3503dcec-cfa7-4816-9563-ddb12fae6c19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap8979b5b9-12', 'timestamp': '2025-11-22T08:02:36.781402', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap8979b5b9-12', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:69:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8979b5b9-12'}, 'message_id': '9bd3f8aa-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '375f72e17a4a9bcafd904a90eb498de12abe00beb9765e2104c3c891daea5bc9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapa5480b52-2a', 'timestamp': '2025-11-22T08:02:36.781402', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapa5480b52-2a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:20:36', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa5480b52-2a'}, 'message_id': '9bd40264-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '73e78787fe054269fb8fd974fc13b46b28b31335f99384346f2a637536600a8e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tap26960794-1a', 'timestamp': '2025-11-22T08:02:36.781402', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tap26960794-1a', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 
'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:75:a1:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26960794-1a'}, 'message_id': '9bd40b60-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': 'dd2d2b1591f9e1c7ddccc79c42fa3628ffa20d96dd934736b929ebe703aad36b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_name': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_name': None, 'resource_id': 'instance-0000005f-fb921e88-22d2-4ae6-9d09-970505f0d5bb-tapb7ba05ee-84', 'timestamp': '2025-11-22T08:02:36.781402', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-253236039', 'name': 'tapb7ba05ee-84', 'instance_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'instance_type': 'm1.nano', 'host': 'c2eef4a67205388dc7b285ac36c810157b7bd6fb232611f0e68f345e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:85:53', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7ba05ee-84'}, 'message_id': '9bd414f2-c779-11f0-9b25-fa163ecc0304', 'monotonic_time': 5299.359894758, 'message_signature': '7de20b2d14634bba9ea8a8fd89867998382c4651b48edbf9385fa5d7b81acdbd'}]}, 'timestamp': '2025-11-22 08:02:36.782435', '_unique_id': '385f16d685e34041a58249d16ac20915'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.784 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:02:36.784 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-253236039>]
Nov 22 03:02:36 np0005531887 nova_compute[186849]: 2025-11-22 08:02:36.996 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-a5480b52-2a9f-4662-8b66-bd078a80ca44" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:36 np0005531887 nova_compute[186849]: 2025-11-22 08:02:36.997 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-a5480b52-2a9f-4662-8b66-bd078a80ca44" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.025 186853 DEBUG nova.objects.instance [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.056 186853 DEBUG nova.virt.libvirt.vif [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.057 186853 DEBUG nova.network.os_vif_util [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.058 186853 DEBUG nova.network.os_vif_util [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.061 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.065 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.069 186853 DEBUG nova.virt.libvirt.driver [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Attempting to detach device tapa5480b52-2a from instance fb921e88-22d2-4ae6-9d09-970505f0d5bb from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.070 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:5f:20:36"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <target dev="tapa5480b52-2a"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.122 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.127 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface>not found in domain: <domain type='kvm' id='38'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <name>instance-0000005f</name>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <uuid>fb921e88-22d2-4ae6-9d09-970505f0d5bb</uuid>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:34</nova:creationTime>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="a5480b52-2a9f-4662-8b66-bd078a80ca44">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='serial'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='uuid'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk' index='2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config' index='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:a0:69:e9'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tap8979b5b9-12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:5f:20:36'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tapa5480b52-2a'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:75:a1:57'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tap26960794-1a'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:ff:85:53'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tapb7ba05ee-84'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c391,c624</label>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c391,c624</imagelabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.129 186853 INFO nova.virt.libvirt.driver [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tapa5480b52-2a from instance fb921e88-22d2-4ae6-9d09-970505f0d5bb from the persistent domain config.
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.129 186853 DEBUG nova.virt.libvirt.driver [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] (1/8): Attempting to detach device tapa5480b52-2a with device alias net1 from instance fb921e88-22d2-4ae6-9d09-970505f0d5bb from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.130 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:5f:20:36"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <target dev="tapa5480b52-2a"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 03:02:37 np0005531887 kernel: tapa5480b52-2a (unregistering): left promiscuous mode
Nov 22 03:02:37 np0005531887 NetworkManager[55210]: <info>  [1763798557.2448] device (tapa5480b52-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.248 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:02:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:37Z|00288|binding|INFO|Releasing lport a5480b52-2a9f-4662-8b66-bd078a80ca44 from this chassis (sb_readonly=0)
Nov 22 03:02:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:37Z|00289|binding|INFO|Setting lport a5480b52-2a9f-4662-8b66-bd078a80ca44 down in Southbound
Nov 22 03:02:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:37Z|00290|binding|INFO|Removing iface tapa5480b52-2a ovn-installed in OVS
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.267 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Received event <DeviceRemovedEvent: 1763798557.2668447, fb921e88-22d2-4ae6-9d09-970505f0d5bb => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.269 186853 DEBUG nova.virt.libvirt.driver [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Start waiting for the detach event from libvirt for device tapa5480b52-2a with device alias net1 for instance fb921e88-22d2-4ae6-9d09-970505f0d5bb _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.269 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.276 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface>not found in domain: <domain type='kvm' id='38'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <name>instance-0000005f</name>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <uuid>fb921e88-22d2-4ae6-9d09-970505f0d5bb</uuid>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:34</nova:creationTime>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="a5480b52-2a9f-4662-8b66-bd078a80ca44">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='serial'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='uuid'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk' index='2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config' index='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:a0:69:e9'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tap8979b5b9-12'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:75:a1:57'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tap26960794-1a'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:ff:85:53'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target dev='tapb7ba05ee-84'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='net3'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c391,c624</label>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c391,c624</imagelabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.276 186853 INFO nova.virt.libvirt.driver [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tapa5480b52-2a from instance fb921e88-22d2-4ae6-9d09-970505f0d5bb from the live domain config.
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.277 186853 DEBUG nova.virt.libvirt.vif [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.277 186853 DEBUG nova.network.os_vif_util [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.278 186853 DEBUG nova.network.os_vif_util [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.278 186853 DEBUG os_vif [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.281 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.281 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5480b52-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.286 186853 INFO os_vif [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a')#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.287 186853 DEBUG nova.virt.libvirt.guest [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:37</nova:creationTime>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:37 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:37 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.287 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:20:36 10.100.0.10'], port_security=['fa:16:3e:5f:20:36 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a5480b52-2a9f-4662-8b66-bd078a80ca44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.289 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a5480b52-2a9f-4662-8b66-bd078a80ca44 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.290 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.309 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9c0eb1-c50d-4574-b279-a3412b9ec957]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.336 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.337 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.338 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.351 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[62b3c6fd-b207-4724-b36b-414fa35dc333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.355 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[496ecaf7-2374-4d18-9646-e2a32e96d39f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.389 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cb6e73-d141-4dd3-9ecd-41d724bc353a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.413 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ca1584-fccd-49d0-844d-099025e46c38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227730, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.432 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7300b08d-2a6a-456c-8272-eea892701971]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227731, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227731, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.435 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.438 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.439 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.439 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:37.439 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.793 186853 DEBUG nova.compute.manager [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.794 186853 DEBUG oslo_concurrency.lockutils [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.794 186853 DEBUG oslo_concurrency.lockutils [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.794 186853 DEBUG oslo_concurrency.lockutils [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.794 186853 DEBUG nova.compute.manager [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.795 186853 WARNING nova.compute.manager [req-9562421e-2359-4f8c-b911-d459a1b378c3 req-512ace7a-5f07-4061-9e6a-a881be91690f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.854 186853 DEBUG nova.compute.manager [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-unplugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.854 186853 DEBUG oslo_concurrency.lockutils [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.854 186853 DEBUG oslo_concurrency.lockutils [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.855 186853 DEBUG oslo_concurrency.lockutils [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.855 186853 DEBUG nova.compute.manager [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-unplugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:37 np0005531887 nova_compute[186849]: 2025-11-22 08:02:37.855 186853 WARNING nova.compute.manager [req-85b1a8e6-03e1-4bbf-8f36-be132d899f23 req-4f32aaaf-3d1a-4006-bf44-f13b5f71ba72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-unplugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.754 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.755 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.755 186853 DEBUG nova.network.neutron [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.847 186853 DEBUG nova.compute.manager [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-deleted-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.848 186853 INFO nova.compute.manager [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Neutron deleted interface a5480b52-2a9f-4662-8b66-bd078a80ca44; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.848 186853 DEBUG nova.network.neutron [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.871 186853 DEBUG nova.objects.instance [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'system_metadata' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.890 186853 DEBUG nova.objects.instance [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'flavor' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.911 186853 DEBUG nova.virt.libvirt.vif [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.912 186853 DEBUG nova.network.os_vif_util [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.912 186853 DEBUG nova.network.os_vif_util [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.917 186853 DEBUG nova.virt.libvirt.guest [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.923 186853 DEBUG nova.virt.libvirt.guest [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface>not found in domain: <domain type='kvm' id='38'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <name>instance-0000005f</name>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <uuid>fb921e88-22d2-4ae6-9d09-970505f0d5bb</uuid>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:37</nova:creationTime>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='serial'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='uuid'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk' index='2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config' index='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:a0:69:e9'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tap8979b5b9-12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:75:a1:57'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tap26960794-1a'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:ff:85:53'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tapb7ba05ee-84'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c391,c624</label>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c391,c624</imagelabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.923 186853 DEBUG nova.virt.libvirt.guest [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.929 186853 DEBUG nova.virt.libvirt.guest [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:5f:20:36"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa5480b52-2a"/></interface> not found in domain: <domain type='kvm' id='38'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <name>instance-0000005f</name>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <uuid>fb921e88-22d2-4ae6-9d09-970505f0d5bb</uuid>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:37</nova:creationTime>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='serial'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='uuid'>fb921e88-22d2-4ae6-9d09-970505f0d5bb</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk' index='2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/disk.config' index='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:a0:69:e9'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tap8979b5b9-12'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:75:a1:57'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tap26960794-1a'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:ff:85:53'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target dev='tapb7ba05ee-84'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='net3'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb/console.log' append='off'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c391,c624</label>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c391,c624</imagelabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.930 186853 WARNING nova.virt.libvirt.driver [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Detaching interface fa:16:3e:5f:20:36 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapa5480b52-2a' not found.#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.931 186853 DEBUG nova.virt.libvirt.vif [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.931 186853 DEBUG nova.network.os_vif_util [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "address": "fa:16:3e:5f:20:36", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5480b52-2a", "ovs_interfaceid": "a5480b52-2a9f-4662-8b66-bd078a80ca44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.932 186853 DEBUG nova.network.os_vif_util [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.933 186853 DEBUG os_vif [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.935 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.936 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5480b52-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.936 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.940 186853 INFO os_vif [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:20:36,bridge_name='br-int',has_traffic_filtering=True,id=a5480b52-2a9f-4662-8b66-bd078a80ca44,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5480b52-2a')#033[00m
Nov 22 03:02:38 np0005531887 nova_compute[186849]: 2025-11-22 08:02:38.941 186853 DEBUG nova.virt.libvirt.guest [req-68158fb9-9c36-4aab-a10d-dee724c322df req-e9803f6a-1cdd-4ce2-96b4-f35efe35a06d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-AttachInterfacesTestJSON-server-253236039</nova:name>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:02:38</nova:creationTime>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="8979b5b9-122d-4f08-832e-80c8509b9f9a">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="26960794-1ab2-46fd-917b-8b5e28186dc3">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    <nova:port uuid="b7ba05ee-8492-4d3b-ae6b-fd90eee67f56">
Nov 22 03:02:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:02:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:02:38 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:02:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:38.998 104084 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 58df35d4-c8af-42a2-874f-5b02efe09040 with type ""#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.000 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:85:53 10.100.0.13'], port_security=['fa:16:3e:ff:85:53 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-728286560', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-728286560', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.001 104084 INFO neutron.agent.ovn.metadata.agent [-] Port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.003 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00291|binding|INFO|Removing iface tapb7ba05ee-84 ovn-installed in OVS
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00292|binding|INFO|Removing lport b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 ovn-installed in OVS
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.010 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.019 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1d731e-edbd-4116-9060-23c4c8a8995f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.022 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.053 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6bcea6-eb72-4087-99a6-35af5be3acb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.057 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[237f6745-615f-4c42-8101-a9954601602d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.090 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c51c519f-7c78-4a97-863f-6d22e6c66c95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.110 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[69f9571b-4c43-4e10-8e89-339751d405cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227737, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.128 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[868d5a06-6856-42df-b940-53790917a661]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227738, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227738, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.130 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.132 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.134 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.134 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.135 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.135 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.135 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.317 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.317 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.317 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.319 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.319 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.328 186853 INFO nova.compute.manager [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Terminating instance#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.335 186853 DEBUG nova.compute.manager [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:02:39 np0005531887 kernel: tap8979b5b9-12 (unregistering): left promiscuous mode
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.3740] device (tap8979b5b9-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00293|binding|INFO|Releasing lport 8979b5b9-122d-4f08-832e-80c8509b9f9a from this chassis (sb_readonly=0)
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00294|binding|INFO|Setting lport 8979b5b9-122d-4f08-832e-80c8509b9f9a down in Southbound
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00295|binding|INFO|Removing iface tap8979b5b9-12 ovn-installed in OVS
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.396 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 kernel: tap26960794-1a (unregistering): left promiscuous mode
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.4030] device (tap26960794-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.416 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00296|binding|INFO|Releasing lport 26960794-1ab2-46fd-917b-8b5e28186dc3 from this chassis (sb_readonly=1)
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00297|binding|INFO|Removing iface tap26960794-1a ovn-installed in OVS
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00298|if_status|INFO|Dropped 1 log messages in last 484 seconds (most recently, 484 seconds ago) due to excessive rate
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00299|if_status|INFO|Not setting lport 26960794-1ab2-46fd-917b-8b5e28186dc3 down as sb is readonly
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.418 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:02:39Z|00300|binding|INFO|Setting lport 26960794-1ab2-46fd-917b-8b5e28186dc3 down in Southbound
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.420 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:69:e9 10.100.0.11'], port_security=['fa:16:3e:a0:69:e9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6931849a-3956-46e1-8bb0-462d1b35b82c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=8979b5b9-122d-4f08-832e-80c8509b9f9a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.422 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 8979b5b9-122d-4f08-832e-80c8509b9f9a in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.424 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:02:39 np0005531887 kernel: tapb7ba05ee-84 (unregistering): left promiscuous mode
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.431 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:a1:57 10.100.0.7'], port_security=['fa:16:3e:75:a1:57 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'fb921e88-22d2-4ae6-9d09-970505f0d5bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=26960794-1ab2-46fd-917b-8b5e28186dc3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.4351] device (tapb7ba05ee-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.444 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e76cb172-38ea-4da7-9600-084722e69285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.448 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 podman[227742]: 2025-11-22 08:02:39.471885883 +0000 UTC m=+0.061459308 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.481 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b8665-d923-4c51-b79b-1005c13b3f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.485 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[495e49f9-4aa1-4cc7-b347-14171689acaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Nov 22 03:02:39 np0005531887 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005f.scope: Consumed 19.752s CPU time.
Nov 22 03:02:39 np0005531887 systemd-machined[153180]: Machine qemu-38-instance-0000005f terminated.
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.520 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cb79125d-2b94-4ffa-bc4d-fb51ce559f0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.538 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2661fef2-be84-4dce-95fc-7f832e08d56f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522138, 'reachable_time': 34113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227785, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.557 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa784b6-f338-4087-a084-d92fe6918e94]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522153, 'tstamp': 522153}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227786, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522157, 'tstamp': 522157}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227786, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.558 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.5621] manager: (tap8979b5b9-12): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.574 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.575 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.575 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.575 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.576 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.577 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 26960794-1ab2-46fd-917b-8b5e28186dc3 in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.578 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a4a282c-db22-41de-b34b-2960aa032ca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.5794] manager: (tap26960794-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.579 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb257e9-3646-407a-a6c2-876ceb85d101]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:39.580 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace which is not needed anymore#033[00m
Nov 22 03:02:39 np0005531887 NetworkManager[55210]: <info>  [1763798559.5902] manager: (tapb7ba05ee-84): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.645 186853 INFO nova.virt.libvirt.driver [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Instance destroyed successfully.#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.645 186853 DEBUG nova.objects.instance [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'resources' on Instance uuid fb921e88-22d2-4ae6-9d09-970505f0d5bb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.658 186853 DEBUG nova.virt.libvirt.vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.659 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.659 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.660 186853 DEBUG os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.661 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.662 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8979b5b9-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.663 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.665 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.673 186853 INFO os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:69:e9,bridge_name='br-int',has_traffic_filtering=True,id=8979b5b9-122d-4f08-832e-80c8509b9f9a,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8979b5b9-12')#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.674 186853 DEBUG nova.virt.libvirt.vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.674 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.674 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.675 186853 DEBUG os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.676 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.676 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26960794-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.677 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.678 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.682 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.683 186853 INFO os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:75:a1:57,bridge_name='br-int',has_traffic_filtering=True,id=26960794-1ab2-46fd-917b-8b5e28186dc3,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26960794-1a')#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.684 186853 DEBUG nova.virt.libvirt.vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-253236039',display_name='tempest-AttachInterfacesTestJSON-server-253236039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-253236039',id=95,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPlFNWaBPkXmo1oJCfLN54tXf/v5r9z/rUlzmZkdlM69K1yL3DA4wVyYYlX6OkgGyxa3eXCtkM1Rz2ltwe/CBwFo/i1WEXVw/DQ8O39rJeQjSBKmeD14VReQyrKNKErHuw==',key_name='tempest-keypair-1627237356',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-hi30eyon',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=fb921e88-22d2-4ae6-9d09-970505f0d5bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.684 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.685 186853 DEBUG nova.network.os_vif_util [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.685 186853 DEBUG os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.687 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.687 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7ba05ee-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.691 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.692 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.694 186853 INFO os_vif [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:85:53,bridge_name='br-int',has_traffic_filtering=True,id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb7ba05ee-84')#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.694 186853 INFO nova.virt.libvirt.driver [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Deleting instance files /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb_del#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.695 186853 INFO nova.virt.libvirt.driver [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Deletion of /var/lib/nova/instances/fb921e88-22d2-4ae6-9d09-970505f0d5bb_del complete#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.773 186853 INFO nova.compute.manager [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.773 186853 DEBUG oslo.service.loopingcall [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.774 186853 DEBUG nova.compute.manager [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:02:39 np0005531887 nova_compute[186849]: 2025-11-22 08:02:39.774 186853 DEBUG nova.network.neutron [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [NOTICE]   (227275) : haproxy version is 2.8.14-c23fe91
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [NOTICE]   (227275) : path to executable is /usr/sbin/haproxy
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [WARNING]  (227275) : Exiting Master process...
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [WARNING]  (227275) : Exiting Master process...
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [ALERT]    (227275) : Current worker (227277) exited with code 143 (Terminated)
Nov 22 03:02:39 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[227262]: [WARNING]  (227275) : All workers exited. Exiting... (0)
Nov 22 03:02:39 np0005531887 systemd[1]: libpod-642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1.scope: Deactivated successfully.
Nov 22 03:02:40 np0005531887 podman[227844]: 2025-11-22 08:02:40.002352716 +0000 UTC m=+0.330009720 container died 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:02:40 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1-userdata-shm.mount: Deactivated successfully.
Nov 22 03:02:40 np0005531887 systemd[1]: var-lib-containers-storage-overlay-7aeb595a4e479dbd4fb62d44e84d3b3c0b1131ed8880d6fd6aad99cad5934290-merged.mount: Deactivated successfully.
Nov 22 03:02:40 np0005531887 podman[227844]: 2025-11-22 08:02:40.330799353 +0000 UTC m=+0.658456367 container cleanup 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:02:40 np0005531887 systemd[1]: libpod-conmon-642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1.scope: Deactivated successfully.
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.576 186853 DEBUG nova.compute.manager [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-unplugged-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.577 186853 DEBUG oslo_concurrency.lockutils [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.577 186853 DEBUG oslo_concurrency.lockutils [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.577 186853 DEBUG oslo_concurrency.lockutils [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.577 186853 DEBUG nova.compute.manager [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-unplugged-8979b5b9-122d-4f08-832e-80c8509b9f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.578 186853 DEBUG nova.compute.manager [req-1167cedb-10e1-4ee7-b0de-d0bbbdf991d1 req-eadb7c7d-489a-47f6-bbfe-0b51f7f29096 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-unplugged-8979b5b9-122d-4f08-832e-80c8509b9f9a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:02:40 np0005531887 podman[227877]: 2025-11-22 08:02:40.582153494 +0000 UTC m=+0.227851208 container remove 642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.592 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[213c8f91-61aa-457e-a1ca-9edafe1be2f9]: (4, ('Sat Nov 22 08:02:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1)\n642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1\nSat Nov 22 08:02:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1)\n642e5f623e8a18a64ae62d2e5d1ad93d03442bc30cc29d0796ea3ce332ec6bb1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.595 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[05abe446-d7bb-490a-ae5a-b5bfece01f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.596 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.598 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531887 kernel: tap6a4a282c-d0: left promiscuous mode
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.615 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.617 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f81290a6-a43a-4f76-b993-d93e4c999adf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.632 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fb79b693-1360-4367-afaa-dc5f978dc1f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.634 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2f387116-00bf-41ac-a0df-744208dfa76e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.640 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.641 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.642 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.642 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.642 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.643 186853 WARNING nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-a5480b52-2a9f-4662-8b66-bd078a80ca44 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.643 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-deleted-b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.643 186853 INFO nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Neutron deleted interface b7ba05ee-8492-4d3b-ae6b-fd90eee67f56; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.643 186853 DEBUG nova.network.neutron [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.656 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cf70db41-22a7-469a-9281-d2489d3c13e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522128, 'reachable_time': 29097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227889, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.659 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:02:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:02:40.660 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[86c8d385-8b53-4c3f-804e-5b19dae4ebcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:02:40 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6a4a282c\x2ddb22\x2d41de\x2db34b\x2d2960aa032ca8.mount: Deactivated successfully.
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.668 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Detach interface failed, port_id=b7ba05ee-8492-4d3b-ae6b-fd90eee67f56, reason: Instance fb921e88-22d2-4ae6-9d09-970505f0d5bb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.669 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-unplugged-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.669 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.669 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.669 186853 DEBUG oslo_concurrency.lockutils [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.670 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-unplugged-26960794-1ab2-46fd-917b-8b5e28186dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.670 186853 DEBUG nova.compute.manager [req-8c0bbe20-2f51-4e98-97b2-aab7de08b4d4 req-64f8ef08-1c7c-4f6b-b771-b864ed8da5a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-unplugged-26960794-1ab2-46fd-917b-8b5e28186dc3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.680 186853 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 22 03:02:40 np0005531887 nova_compute[186849]: 2025-11-22 08:02:40.680 186853 DEBUG nova.network.neutron [-] Unable to show port b7ba05ee-8492-4d3b-ae6b-fd90eee67f56 as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666#033[00m
Nov 22 03:02:41 np0005531887 nova_compute[186849]: 2025-11-22 08:02:41.114 186853 INFO nova.network.neutron [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Port a5480b52-2a9f-4662-8b66-bd078a80ca44 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 22 03:02:41 np0005531887 nova_compute[186849]: 2025-11-22 08:02:41.707 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.004 186853 DEBUG nova.compute.manager [req-d7c96106-8857-48c8-a88e-e5705fd59056 req-3278085e-78b4-4a71-8edf-31444a7e9833 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-deleted-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.004 186853 INFO nova.compute.manager [req-d7c96106-8857-48c8-a88e-e5705fd59056 req-3278085e-78b4-4a71-8edf-31444a7e9833 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Neutron deleted interface 26960794-1ab2-46fd-917b-8b5e28186dc3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.005 186853 DEBUG nova.network.neutron [req-d7c96106-8857-48c8-a88e-e5705fd59056 req-3278085e-78b4-4a71-8edf-31444a7e9833 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.041 186853 DEBUG nova.compute.manager [req-d7c96106-8857-48c8-a88e-e5705fd59056 req-3278085e-78b4-4a71-8edf-31444a7e9833 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Detach interface failed, port_id=26960794-1ab2-46fd-917b-8b5e28186dc3, reason: Instance fb921e88-22d2-4ae6-9d09-970505f0d5bb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.677 186853 DEBUG nova.compute.manager [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.677 186853 DEBUG oslo_concurrency.lockutils [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.678 186853 DEBUG oslo_concurrency.lockutils [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.678 186853 DEBUG oslo_concurrency.lockutils [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.678 186853 DEBUG nova.compute.manager [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.678 186853 WARNING nova.compute.manager [req-0bf2785f-bc6b-410b-8c74-82729ef7b581 req-0d747239-3448-4e4d-9018-8a1750f7b474 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-8979b5b9-122d-4f08-832e-80c8509b9f9a for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.740 186853 DEBUG nova.compute.manager [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.741 186853 DEBUG oslo_concurrency.lockutils [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.741 186853 DEBUG oslo_concurrency.lockutils [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.741 186853 DEBUG oslo_concurrency.lockutils [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.742 186853 DEBUG nova.compute.manager [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] No waiting events found dispatching network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:02:42 np0005531887 nova_compute[186849]: 2025-11-22 08:02:42.742 186853 WARNING nova.compute.manager [req-01fe14e3-b2dd-4318-adf7-99f83424add0 req-4a9f6cfe-7864-476d-9365-26507cb340cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received unexpected event network-vif-plugged-26960794-1ab2-46fd-917b-8b5e28186dc3 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.234 186853 DEBUG nova.network.neutron [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.248 186853 INFO nova.compute.manager [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Took 3.47 seconds to deallocate network for instance.#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.360 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.361 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.438 186853 DEBUG nova.compute.provider_tree [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.451 186853 DEBUG nova.scheduler.client.report [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.469 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.497 186853 INFO nova.scheduler.client.report [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Deleted allocations for instance fb921e88-22d2-4ae6-9d09-970505f0d5bb#033[00m
Nov 22 03:02:43 np0005531887 nova_compute[186849]: 2025-11-22 08:02:43.552 186853 DEBUG oslo_concurrency.lockutils [None req-c56d879d-7f06-40f9-892c-c383285a6d47 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "fb921e88-22d2-4ae6-9d09-970505f0d5bb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:44 np0005531887 nova_compute[186849]: 2025-11-22 08:02:44.076 186853 DEBUG nova.compute.manager [req-b0774cbc-bdd2-47d6-a71e-7405e97133dc req-20cc8532-f2a5-4625-bec4-4e6d27555d1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Received event network-vif-deleted-8979b5b9-122d-4f08-832e-80c8509b9f9a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:02:44 np0005531887 nova_compute[186849]: 2025-11-22 08:02:44.690 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:45 np0005531887 podman[227894]: 2025-11-22 08:02:45.304195165 +0000 UTC m=+0.094559167 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal)
Nov 22 03:02:45 np0005531887 nova_compute[186849]: 2025-11-22 08:02:45.414 186853 DEBUG nova.network.neutron [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Updating instance_info_cache with network_info: [{"id": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "address": "fa:16:3e:a0:69:e9", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8979b5b9-12", "ovs_interfaceid": "8979b5b9-122d-4f08-832e-80c8509b9f9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "26960794-1ab2-46fd-917b-8b5e28186dc3", "address": "fa:16:3e:75:a1:57", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26960794-1a", "ovs_interfaceid": "26960794-1ab2-46fd-917b-8b5e28186dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "address": "fa:16:3e:ff:85:53", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7ba05ee-84", "ovs_interfaceid": "b7ba05ee-8492-4d3b-ae6b-fd90eee67f56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:02:45 np0005531887 nova_compute[186849]: 2025-11-22 08:02:45.454 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-fb921e88-22d2-4ae6-9d09-970505f0d5bb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:02:45 np0005531887 nova_compute[186849]: 2025-11-22 08:02:45.492 186853 DEBUG oslo_concurrency.lockutils [None req-bee60642-d2ad-408b-829c-6d2ebd15a79f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-fb921e88-22d2-4ae6-9d09-970505f0d5bb-a5480b52-2a9f-4662-8b66-bd078a80ca44" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 8.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:46 np0005531887 nova_compute[186849]: 2025-11-22 08:02:46.709 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:48 np0005531887 podman[227918]: 2025-11-22 08:02:48.855435883 +0000 UTC m=+0.074669038 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:02:48 np0005531887 podman[227919]: 2025-11-22 08:02:48.885594956 +0000 UTC m=+0.101057114 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:02:49 np0005531887 nova_compute[186849]: 2025-11-22 08:02:49.693 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:49 np0005531887 nova_compute[186849]: 2025-11-22 08:02:49.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:51 np0005531887 nova_compute[186849]: 2025-11-22 08:02:51.714 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.962 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.963 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.34571838378906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.964 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:52 np0005531887 nova_compute[186849]: 2025-11-22 08:02:52.964 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.014 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.014 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.033 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.043 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.063 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:02:53 np0005531887 nova_compute[186849]: 2025-11-22 08:02:53.064 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:53 np0005531887 podman[227965]: 2025-11-22 08:02:53.839545156 +0000 UTC m=+0.060536704 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.064 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.065 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.643 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798559.6423266, fb921e88-22d2-4ae6-9d09-970505f0d5bb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.644 186853 INFO nova.compute.manager [-] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.659 186853 DEBUG nova.compute.manager [None req-a65ffaca-8592-41aa-8dae-ac53d8f63420 - - - - - -] [instance: fb921e88-22d2-4ae6-9d09-970505f0d5bb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.695 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:54 np0005531887 nova_compute[186849]: 2025-11-22 08:02:54.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.380 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.380 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.436 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.563 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.563 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.570 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.571 186853 INFO nova.compute.claims [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.718 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.722 186853 DEBUG nova.compute.provider_tree [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.734 186853 DEBUG nova.scheduler.client.report [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.762 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.763 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.879 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.879 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.907 186853 INFO nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:02:56 np0005531887 nova_compute[186849]: 2025-11-22 08:02:56.965 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.184 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.186 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.186 186853 INFO nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Creating image(s)#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.187 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.187 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.188 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.203 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.268 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.269 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.269 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.284 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.350 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.351 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.584 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk 1073741824" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.586 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.586 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.663 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.664 186853 DEBUG nova.virt.disk.api [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Checking if we can resize image /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.665 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.690 186853 DEBUG nova.policy [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.733 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.734 186853 DEBUG nova.virt.disk.api [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Cannot resize image /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.734 186853 DEBUG nova.objects.instance [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'migration_context' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.747 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.747 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Ensure instance console log exists: /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.748 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.748 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:02:57 np0005531887 nova_compute[186849]: 2025-11-22 08:02:57.748 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:02:58 np0005531887 nova_compute[186849]: 2025-11-22 08:02:58.035 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.700 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:02:59 np0005531887 nova_compute[186849]: 2025-11-22 08:02:59.786 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:03:00 np0005531887 nova_compute[186849]: 2025-11-22 08:03:00.032 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Successfully created port: 59d94620-ece2-4ba1-96e7-f25b12dc87fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:03:00 np0005531887 podman[228005]: 2025-11-22 08:03:00.859081692 +0000 UTC m=+0.078368682 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:03:01 np0005531887 nova_compute[186849]: 2025-11-22 08:03:01.015 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:01.018 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:01.021 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:03:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:01.022 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:01 np0005531887 nova_compute[186849]: 2025-11-22 08:03:01.719 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.121 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Successfully updated port: 59d94620-ece2-4ba1-96e7-f25b12dc87fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.143 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.143 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.144 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.218 186853 DEBUG nova.compute.manager [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.219 186853 DEBUG nova.compute.manager [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.219 186853 DEBUG oslo_concurrency.lockutils [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:02 np0005531887 nova_compute[186849]: 2025-11-22 08:03:02.666 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.120 186853 DEBUG nova.network.neutron [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.144 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.144 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Instance network_info: |[{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.145 186853 DEBUG oslo_concurrency.lockutils [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.145 186853 DEBUG nova.network.neutron [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.148 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Start _get_guest_xml network_info=[{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.154 186853 WARNING nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.166 186853 DEBUG nova.virt.libvirt.host [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.168 186853 DEBUG nova.virt.libvirt.host [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.177 186853 DEBUG nova.virt.libvirt.host [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.179 186853 DEBUG nova.virt.libvirt.host [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.180 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.180 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.181 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.181 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.181 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.182 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.182 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.182 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.182 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.183 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.183 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.183 186853 DEBUG nova.virt.hardware [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.187 186853 DEBUG nova.virt.libvirt.vif [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.187 186853 DEBUG nova.network.os_vif_util [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.188 186853 DEBUG nova.network.os_vif_util [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.188 186853 DEBUG nova.objects.instance [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.204 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <uuid>87323ed4-21ca-4440-802a-6f396fa56b00</uuid>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <name>instance-00000065</name>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:name>tempest-tempest.common.compute-instance-2137784552</nova:name>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:03:04</nova:creationTime>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        <nova:port uuid="59d94620-ece2-4ba1-96e7-f25b12dc87fc">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="serial">87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="uuid">87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:78:71:fb"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <target dev="tap59d94620-ec"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log" append="off"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:03:04 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:03:04 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:03:04 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:03:04 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.206 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Preparing to wait for external event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.207 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.208 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.208 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.209 186853 DEBUG nova.virt.libvirt.vif [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.209 186853 DEBUG nova.network.os_vif_util [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.210 186853 DEBUG nova.network.os_vif_util [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.211 186853 DEBUG os_vif [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.212 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.212 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.217 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59d94620-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.218 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap59d94620-ec, col_values=(('external_ids', {'iface-id': '59d94620-ece2-4ba1-96e7-f25b12dc87fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:71:fb', 'vm-uuid': '87323ed4-21ca-4440-802a-6f396fa56b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:04 np0005531887 NetworkManager[55210]: <info>  [1763798584.2220] manager: (tap59d94620-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.224 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.229 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.231 186853 INFO os_vif [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec')#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.405 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.406 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.406 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:78:71:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.407 186853 INFO nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Using config drive#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.896 186853 INFO nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Creating config drive at /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config#033[00m
Nov 22 03:03:04 np0005531887 nova_compute[186849]: 2025-11-22 08:03:04.902 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5agl41r9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.027 186853 DEBUG oslo_concurrency.processutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5agl41r9" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:05 np0005531887 kernel: tap59d94620-ec: entered promiscuous mode
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.1285] manager: (tap59d94620-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Nov 22 03:03:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:05Z|00301|binding|INFO|Claiming lport 59d94620-ece2-4ba1-96e7-f25b12dc87fc for this chassis.
Nov 22 03:03:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:05Z|00302|binding|INFO|59d94620-ece2-4ba1-96e7-f25b12dc87fc: Claiming fa:16:3e:78:71:fb 10.100.0.8
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.127 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:05Z|00303|binding|INFO|Setting lport 59d94620-ece2-4ba1-96e7-f25b12dc87fc ovn-installed in OVS
Nov 22 03:03:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:05Z|00304|binding|INFO|Setting lport 59d94620-ece2-4ba1-96e7-f25b12dc87fc up in Southbound
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.142 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.143 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.139 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:71:fb 10.100.0.8'], port_security=['fa:16:3e:78:71:fb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87323ed4-21ca-4440-802a-6f396fa56b00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6da332c7-e52a-4f92-8c24-c2ee0c6e77d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=59d94620-ece2-4ba1-96e7-f25b12dc87fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.141 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 59d94620-ece2-4ba1-96e7-f25b12dc87fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.143 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.156 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c1cd231e-3561-4c09-aeeb-a3c9d7b1cbd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.157 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a4a282c-d1 in ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.159 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a4a282c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.159 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[072d5752-fb76-4b15-9cbb-6c24a067de0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.160 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d2ba85-a523-45c2-97a5-33471fb13c89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.172 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[f4cb1453-cc05-428b-8a22-d806c26752ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 systemd-machined[153180]: New machine qemu-39-instance-00000065.
Nov 22 03:03:05 np0005531887 systemd-udevd[228062]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.1955] device (tap59d94620-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.1966] device (tap59d94620-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:03:05 np0005531887 systemd[1]: Started Virtual Machine qemu-39-instance-00000065.
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.200 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1af8bf7e-713e-40c8-91d7-9e4c947b77ed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 podman[228038]: 2025-11-22 08:03:05.22331047 +0000 UTC m=+0.108700900 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.232 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3762cf-c77c-44e1-b43c-d9cb2de0d729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.238 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[79db7580-7a65-448b-878e-788f109ea038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.2395] manager: (tap6a4a282c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Nov 22 03:03:05 np0005531887 systemd-udevd[228066]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.270 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[61499f7f-4186-48ca-88f5-047624b5cf80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.272 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[494dd8cb-2027-44b0-b675-8c4e9ff3318a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.2954] device (tap6a4a282c-d0): carrier: link connected
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.301 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6eac4a0d-5c82-4068-8be6-a8f4258e89ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.322 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5baf25-f31d-45bf-8544-3398ae92deb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532790, 'reachable_time': 24498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228097, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.341 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[47db1818-d76c-4e29-8f55-1f0ede394005]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:7a86'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532790, 'tstamp': 532790}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228098, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.364 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[29ca0eb4-46f8-4392-8a54-881a7dc6bef3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532790, 'reachable_time': 24498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228099, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.398 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[31f340c4-fae2-47f8-a17a-900e678daf01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.464 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[213cdf98-0766-4901-9eff-0ef14d60d7e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.466 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.466 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.467 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.469 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 NetworkManager[55210]: <info>  [1763798585.4696] manager: (tap6a4a282c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Nov 22 03:03:05 np0005531887 kernel: tap6a4a282c-d0: entered promiscuous mode
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.475 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:05Z|00305|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.477 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.480 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.481 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c324b4-ae3b-43d3-b9c8-e224bc68089e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.482 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6a4a282c-db22-41de-b34b-2960aa032ca8.pid.haproxy
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6a4a282c-db22-41de-b34b-2960aa032ca8
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:03:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:05.484 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'env', 'PROCESS_TAG=haproxy-6a4a282c-db22-41de-b34b-2960aa032ca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a4a282c-db22-41de-b34b-2960aa032ca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.774 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798585.7736428, 87323ed4-21ca-4440-802a-6f396fa56b00 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.775 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] VM Started (Lifecycle Event)#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.797 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.801 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798585.7738516, 87323ed4-21ca-4440-802a-6f396fa56b00 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.802 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.820 186853 DEBUG nova.compute.manager [req-3a46afb1-d6b6-491e-9df2-2b6508fb02e3 req-337cd2f1-3318-464a-b4db-9be298d9e5fb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.821 186853 DEBUG oslo_concurrency.lockutils [req-3a46afb1-d6b6-491e-9df2-2b6508fb02e3 req-337cd2f1-3318-464a-b4db-9be298d9e5fb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.821 186853 DEBUG oslo_concurrency.lockutils [req-3a46afb1-d6b6-491e-9df2-2b6508fb02e3 req-337cd2f1-3318-464a-b4db-9be298d9e5fb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.821 186853 DEBUG oslo_concurrency.lockutils [req-3a46afb1-d6b6-491e-9df2-2b6508fb02e3 req-337cd2f1-3318-464a-b4db-9be298d9e5fb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.822 186853 DEBUG nova.compute.manager [req-3a46afb1-d6b6-491e-9df2-2b6508fb02e3 req-337cd2f1-3318-464a-b4db-9be298d9e5fb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Processing event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.823 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.825 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.828 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.830 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798585.827717, 87323ed4-21ca-4440-802a-6f396fa56b00 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.831 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.836 186853 INFO nova.virt.libvirt.driver [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Instance spawned successfully.#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.837 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.860 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.861 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.861 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.862 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.862 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.862 186853 DEBUG nova.virt.libvirt.driver [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.866 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.869 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.890 186853 DEBUG nova.network.neutron [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.890 186853 DEBUG nova.network.neutron [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.911 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.925 186853 DEBUG oslo_concurrency.lockutils [req-ca65ce5d-86a1-421c-8252-247c4eae4518 req-8f1526fb-4bec-40a6-a1b4-4d342621a080 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.962 186853 INFO nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Took 8.78 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:03:05 np0005531887 nova_compute[186849]: 2025-11-22 08:03:05.962 186853 DEBUG nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:03:05 np0005531887 podman[228136]: 2025-11-22 08:03:05.868011853 +0000 UTC m=+0.023366439 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:03:06 np0005531887 nova_compute[186849]: 2025-11-22 08:03:06.051 186853 INFO nova.compute.manager [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Took 9.52 seconds to build instance.#033[00m
Nov 22 03:03:06 np0005531887 nova_compute[186849]: 2025-11-22 08:03:06.067 186853 DEBUG oslo_concurrency.lockutils [None req-be8a04be-177a-4bc3-87fd-ee2c6b9ece02 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:06 np0005531887 podman[228136]: 2025-11-22 08:03:06.503008498 +0000 UTC m=+0.658363064 container create 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:03:06 np0005531887 systemd[1]: Started libpod-conmon-7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea.scope.
Nov 22 03:03:06 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:03:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74bb2844f1bff1c1301e99ae8ebfa5449098a366a5b38d3285eea2788b7024bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:03:06 np0005531887 nova_compute[186849]: 2025-11-22 08:03:06.722 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:06 np0005531887 podman[228136]: 2025-11-22 08:03:06.79258954 +0000 UTC m=+0.947944136 container init 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:03:06 np0005531887 podman[228136]: 2025-11-22 08:03:06.799378384 +0000 UTC m=+0.954732950 container start 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:03:06 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [NOTICE]   (228155) : New worker (228157) forked
Nov 22 03:03:06 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [NOTICE]   (228155) : Loading success.
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.923 186853 DEBUG nova.compute.manager [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.924 186853 DEBUG oslo_concurrency.lockutils [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.924 186853 DEBUG oslo_concurrency.lockutils [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.924 186853 DEBUG oslo_concurrency.lockutils [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.924 186853 DEBUG nova.compute.manager [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:07 np0005531887 nova_compute[186849]: 2025-11-22 08:03:07.925 186853 WARNING nova.compute.manager [req-8923a4aa-912c-4002-bfd4-19137474f33d req-14fdc98d-87d1-44a6-9e57-94ea9a62c36c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:09 np0005531887 nova_compute[186849]: 2025-11-22 08:03:09.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:09 np0005531887 podman[228166]: 2025-11-22 08:03:09.851212267 +0000 UTC m=+0.069361270 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:03:11 np0005531887 nova_compute[186849]: 2025-11-22 08:03:11.724 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:12 np0005531887 nova_compute[186849]: 2025-11-22 08:03:12.410 186853 DEBUG nova.compute.manager [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:12 np0005531887 nova_compute[186849]: 2025-11-22 08:03:12.411 186853 DEBUG nova.compute.manager [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:12 np0005531887 nova_compute[186849]: 2025-11-22 08:03:12.411 186853 DEBUG oslo_concurrency.lockutils [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:12 np0005531887 nova_compute[186849]: 2025-11-22 08:03:12.411 186853 DEBUG oslo_concurrency.lockutils [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:12 np0005531887 nova_compute[186849]: 2025-11-22 08:03:12.412 186853 DEBUG nova.network.neutron [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:14 np0005531887 nova_compute[186849]: 2025-11-22 08:03:14.030 186853 DEBUG nova.network.neutron [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:14 np0005531887 nova_compute[186849]: 2025-11-22 08:03:14.030 186853 DEBUG nova.network.neutron [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:14 np0005531887 nova_compute[186849]: 2025-11-22 08:03:14.091 186853 DEBUG oslo_concurrency.lockutils [req-d506c1ef-3051-425e-80ba-852e2c2102f5 req-25dd8f09-b95e-4136-9993-ffc73748c1d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:14 np0005531887 nova_compute[186849]: 2025-11-22 08:03:14.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:15 np0005531887 podman[228190]: 2025-11-22 08:03:15.832339248 +0000 UTC m=+0.055632589 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64)
Nov 22 03:03:16 np0005531887 nova_compute[186849]: 2025-11-22 08:03:16.726 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:19 np0005531887 nova_compute[186849]: 2025-11-22 08:03:19.230 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:19 np0005531887 podman[228214]: 2025-11-22 08:03:19.852891419 +0000 UTC m=+0.071407094 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:03:19 np0005531887 podman[228215]: 2025-11-22 08:03:19.883037392 +0000 UTC m=+0.096809895 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:03:21 np0005531887 nova_compute[186849]: 2025-11-22 08:03:21.729 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:24 np0005531887 nova_compute[186849]: 2025-11-22 08:03:24.233 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:24 np0005531887 podman[228280]: 2025-11-22 08:03:24.830693893 +0000 UTC m=+0.053346640 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.041 186853 DEBUG nova.compute.manager [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.041 186853 DEBUG nova.compute.manager [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.041 186853 DEBUG oslo_concurrency.lockutils [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.041 186853 DEBUG oslo_concurrency.lockutils [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.042 186853 DEBUG nova.network.neutron [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:26Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:71:fb 10.100.0.8
Nov 22 03:03:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:26Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:71:fb 10.100.0.8
Nov 22 03:03:26 np0005531887 nova_compute[186849]: 2025-11-22 08:03:26.731 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:29 np0005531887 nova_compute[186849]: 2025-11-22 08:03:29.042 186853 DEBUG nova.network.neutron [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:29 np0005531887 nova_compute[186849]: 2025-11-22 08:03:29.042 186853 DEBUG nova.network.neutron [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:29 np0005531887 nova_compute[186849]: 2025-11-22 08:03:29.093 186853 DEBUG oslo_concurrency.lockutils [req-07d33adb-8664-4ecf-be7a-a9fb1639f04d req-74ac4585-56e2-4cc7-b338-14c1a56ee598 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:29 np0005531887 nova_compute[186849]: 2025-11-22 08:03:29.238 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:30 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:30Z|00306|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.539 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.791 186853 DEBUG nova.compute.manager [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.791 186853 DEBUG nova.compute.manager [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.792 186853 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.792 186853 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:30 np0005531887 nova_compute[186849]: 2025-11-22 08:03:30.792 186853 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:31 np0005531887 nova_compute[186849]: 2025-11-22 08:03:31.733 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:31 np0005531887 podman[228305]: 2025-11-22 08:03:31.831912708 +0000 UTC m=+0.045278593 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 22 03:03:31 np0005531887 nova_compute[186849]: 2025-11-22 08:03:31.966 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:31 np0005531887 nova_compute[186849]: 2025-11-22 08:03:31.966 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:31 np0005531887 nova_compute[186849]: 2025-11-22 08:03:31.967 186853 DEBUG nova.objects.instance [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:32 np0005531887 nova_compute[186849]: 2025-11-22 08:03:32.721 186853 DEBUG nova.objects.instance [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'pci_requests' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:03:32 np0005531887 nova_compute[186849]: 2025-11-22 08:03:32.754 186853 DEBUG nova.network.neutron [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:03:33 np0005531887 nova_compute[186849]: 2025-11-22 08:03:33.160 186853 DEBUG nova.policy [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53d50a77d3f2416c8fcc459cc343d045', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:03:33 np0005531887 nova_compute[186849]: 2025-11-22 08:03:33.192 186853 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:33 np0005531887 nova_compute[186849]: 2025-11-22 08:03:33.193 186853 DEBUG nova.network.neutron [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:33 np0005531887 nova_compute[186849]: 2025-11-22 08:03:33.226 186853 DEBUG oslo_concurrency.lockutils [req-2ace43d7-0713-4b88-b507-77e5d1163989 req-5e9c504d-75fe-483a-80d7-a8f31f120c66 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.242 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.523 186853 DEBUG nova.network.neutron [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Successfully updated port: 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.539 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.540 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.540 186853 DEBUG nova.network.neutron [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.740 186853 DEBUG nova.compute.manager [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.740 186853 DEBUG nova.compute.manager [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.741 186853 DEBUG oslo_concurrency.lockutils [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:34 np0005531887 nova_compute[186849]: 2025-11-22 08:03:34.781 186853 WARNING nova.network.neutron [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] 6a4a282c-db22-41de-b34b-2960aa032ca8 already exists in list: networks containing: ['6a4a282c-db22-41de-b34b-2960aa032ca8']. ignoring it#033[00m
Nov 22 03:03:35 np0005531887 podman[228322]: 2025-11-22 08:03:35.826087461 +0000 UTC m=+0.052052326 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 03:03:36 np0005531887 nova_compute[186849]: 2025-11-22 08:03:36.673 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:36 np0005531887 nova_compute[186849]: 2025-11-22 08:03:36.736 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.053 186853 DEBUG nova.network.neutron [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.072 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.073 186853 DEBUG oslo_concurrency.lockutils [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.074 186853 DEBUG nova.network.neutron [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.077 186853 DEBUG nova.virt.libvirt.vif [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.078 186853 DEBUG nova.network.os_vif_util [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.079 186853 DEBUG nova.network.os_vif_util [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.080 186853 DEBUG os_vif [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.080 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.081 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.081 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.084 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.084 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap650f9e14-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.085 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap650f9e14-a6, col_values=(('external_ids', {'iface-id': '650f9e14-a6b8-46d0-8167-1eb22fcbc8fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:e8:28', 'vm-uuid': '87323ed4-21ca-4440-802a-6f396fa56b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 NetworkManager[55210]: <info>  [1763798617.0882] manager: (tap650f9e14-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.089 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.094 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.095 186853 INFO os_vif [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6')#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.096 186853 DEBUG nova.virt.libvirt.vif [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.096 186853 DEBUG nova.network.os_vif_util [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.096 186853 DEBUG nova.network.os_vif_util [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.099 186853 DEBUG nova.virt.libvirt.guest [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] attach device xml: <interface type="ethernet">
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:03:37 np0005531887 nova_compute[186849]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 22 03:03:37 np0005531887 kernel: tap650f9e14-a6: entered promiscuous mode
Nov 22 03:03:37 np0005531887 NetworkManager[55210]: <info>  [1763798617.1121] manager: (tap650f9e14-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.113 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:37Z|00307|binding|INFO|Claiming lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for this chassis.
Nov 22 03:03:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:37Z|00308|binding|INFO|650f9e14-a6b8-46d0-8167-1eb22fcbc8fc: Claiming fa:16:3e:35:e8:28 10.100.0.6
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.120 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e8:28 10.100.0.6'], port_security=['fa:16:3e:35:e8:28 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87323ed4-21ca-4440-802a-6f396fa56b00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.122 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 bound to our chassis#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.123 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:03:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:37Z|00309|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc ovn-installed in OVS
Nov 22 03:03:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:37Z|00310|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc up in Southbound
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.129 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.133 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.139 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95535a88-4638-4bd4-bb32-643e3f47106c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 systemd-udevd[228350]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:03:37 np0005531887 NetworkManager[55210]: <info>  [1763798617.1648] device (tap650f9e14-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:03:37 np0005531887 NetworkManager[55210]: <info>  [1763798617.1655] device (tap650f9e14-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.175 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c95cfd91-1571-41d2-8fb6-7932a01296d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.178 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[05d16424-ba17-47e1-895f-136896f369fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.207 186853 DEBUG nova.virt.libvirt.driver [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.208 186853 DEBUG nova.virt.libvirt.driver [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.208 186853 DEBUG nova.virt.libvirt.driver [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:78:71:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.208 186853 DEBUG nova.virt.libvirt.driver [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] No VIF found with MAC fa:16:3e:35:e8:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.210 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[459d1773-4bfc-46be-817b-b2d9d049ddab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.231 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ad10fc66-c81e-438d-a253-f849f2d57f17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532790, 'reachable_time': 24498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228356, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.240 186853 DEBUG nova.virt.libvirt.guest [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:name>tempest-tempest.common.compute-instance-2137784552</nova:name>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:03:37</nova:creationTime>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:port uuid="59d94620-ece2-4ba1-96e7-f25b12dc87fc">
Nov 22 03:03:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:03:37 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:37 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:03:37 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:03:37 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.248 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea62129-bfd6-4eda-bbba-28387bcf2cbd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532803, 'tstamp': 532803}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228357, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532806, 'tstamp': 532806}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228357, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.250 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.253 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.253 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.253 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.254 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.254 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.276 186853 DEBUG oslo_concurrency.lockutils [None req-a9292c9d-2cd9-45b9-8394-e17f4713e76f 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.336 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.337 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:37.337 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.826 186853 DEBUG nova.compute.manager [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.826 186853 DEBUG oslo_concurrency.lockutils [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.827 186853 DEBUG oslo_concurrency.lockutils [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.827 186853 DEBUG oslo_concurrency.lockutils [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.827 186853 DEBUG nova.compute.manager [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:03:37 np0005531887 nova_compute[186849]: 2025-11-22 08:03:37.827 186853 WARNING nova.compute.manager [req-ebda8c76-0220-47fc-8ee1-293667076571 req-7a102af1-98f1-4cdc-9c4c-140601258c75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.164 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.165 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.185 186853 DEBUG nova.objects.instance [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'flavor' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.215 186853 DEBUG nova.virt.libvirt.vif [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.216 186853 DEBUG nova.network.os_vif_util [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.216 186853 DEBUG nova.network.os_vif_util [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.219 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.221 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.223 186853 DEBUG nova.virt.libvirt.driver [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Attempting to detach device tap650f9e14-a6 from instance 87323ed4-21ca-4440-802a-6f396fa56b00 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.224 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.244 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.247 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface>not found in domain: <domain type='kvm' id='39'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <name>instance-00000065</name>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <uuid>87323ed4-21ca-4440-802a-6f396fa56b00</uuid>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-tempest.common.compute-instance-2137784552</nova:name>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:03:37</nova:creationTime>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:port uuid="59d94620-ece2-4ba1-96e7-f25b12dc87fc">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='serial'>87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='uuid'>87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk' index='2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config' index='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:78:71:fb'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='tap59d94620-ec'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:35:e8:28'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='tap650f9e14-a6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='net1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log' append='off'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log' append='off'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c337,c775</label>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c337,c775</imagelabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.249 186853 INFO nova.virt.libvirt.driver [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap650f9e14-a6 from instance 87323ed4-21ca-4440-802a-6f396fa56b00 from the persistent domain config.
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.249 186853 DEBUG nova.virt.libvirt.driver [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] (1/8): Attempting to detach device tap650f9e14-a6 with device alias net1 from instance 87323ed4-21ca-4440-802a-6f396fa56b00 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.249 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] detach device xml: <interface type="ethernet">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:35:e8:28"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <target dev="tap650f9e14-a6"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.272 186853 DEBUG nova.network.neutron [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.272 186853 DEBUG nova.network.neutron [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.289 186853 DEBUG oslo_concurrency.lockutils [req-22b89b37-bf09-4205-99af-f698bec740c9 req-b6d13d54-5732-4630-8936-d8771e065ad6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:03:38 np0005531887 kernel: tap650f9e14-a6 (unregistering): left promiscuous mode
Nov 22 03:03:38 np0005531887 NetworkManager[55210]: <info>  [1763798618.3510] device (tap650f9e14-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:03:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:38Z|00311|binding|INFO|Releasing lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc from this chassis (sb_readonly=0)
Nov 22 03:03:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:38Z|00312|binding|INFO|Setting lport 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc down in Southbound
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:03:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:38Z|00313|binding|INFO|Removing iface tap650f9e14-a6 ovn-installed in OVS
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.364 186853 DEBUG nova.virt.libvirt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Received event <DeviceRemovedEvent: 1763798618.3644316, 87323ed4-21ca-4440-802a-6f396fa56b00 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.366 186853 DEBUG nova.virt.libvirt.driver [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Start waiting for the detach event from libvirt for device tap650f9e14-a6 with device alias net1 for instance 87323ed4-21ca-4440-802a-6f396fa56b00 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.367 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.371 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:35:e8:28"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap650f9e14-a6"/></interface>not found in domain: <domain type='kvm' id='39'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <name>instance-00000065</name>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <uuid>87323ed4-21ca-4440-802a-6f396fa56b00</uuid>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-tempest.common.compute-instance-2137784552</nova:name>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:03:37</nova:creationTime>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:port uuid="59d94620-ece2-4ba1-96e7-f25b12dc87fc">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:port uuid="650f9e14-a6b8-46d0-8167-1eb22fcbc8fc">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <memory unit='KiB'>131072</memory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <vcpu placement='static'>1</vcpu>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <resource>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <partition>/machine</partition>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </resource>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <sysinfo type='smbios'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='manufacturer'>RDO</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='product'>OpenStack Compute</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='serial'>87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='uuid'>87323ed4-21ca-4440-802a-6f396fa56b00</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <entry name='family'>Virtual Machine</entry>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <boot dev='hd'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <smbios mode='sysinfo'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <vmcoreinfo state='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <cpu mode='custom' match='exact' check='full'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <model fallback='forbid'>Nehalem</model>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='x2apic'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='hypervisor'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <feature policy='require' name='vme'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <clock offset='utc'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='pit' tickpolicy='delay'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <timer name='hpet' present='no'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_poweroff>destroy</on_poweroff>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_reboot>restart</on_reboot>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <on_crash>destroy</on_crash>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <disk type='file' device='disk'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk' index='2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backingStore type='file' index='3'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <format type='raw'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <backingStore/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      </backingStore>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='vda' bus='virtio'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='virtio-disk0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <disk type='file' device='cdrom'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='qemu' type='raw' cache='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk.config' index='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backingStore/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='sda' bus='sata'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <readonly/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='sata0-0-0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='0' model='pcie-root'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pcie.0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='1' port='0x10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='2' port='0x11'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='3' port='0x12'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='4' port='0x13'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='5' port='0x14'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='6' port='0x15'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='7' port='0x16'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='8' port='0x17'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.8'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='9' port='0x18'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.9'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='10' port='0x19'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='11' port='0x1a'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.11'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='12' port='0x1b'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.12'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='13' port='0x1c'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.13'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='14' port='0x1d'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.14'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='15' port='0x1e'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.15'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='16' port='0x1f'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.16'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='17' port='0x20'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.17'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='18' port='0x21'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.18'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='19' port='0x22'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.19'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='20' port='0x23'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.20'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='21' port='0x24'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.21'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='22' port='0x25'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.22'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='23' port='0x26'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.23'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='24' port='0x27'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.24'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-root-port'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target chassis='25' port='0x28'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.25'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model name='pcie-pci-bridge'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='pci.26'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='usb'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <controller type='sata' index='0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='ide'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </controller>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <interface type='ethernet'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mac address='fa:16:3e:78:71:fb'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target dev='tap59d94620-ec'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model type='virtio'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <driver name='vhost' rx_queue_size='512'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <mtu size='1442'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='net0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <serial type='pty'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log' append='off'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target type='isa-serial' port='0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:        <model name='isa-serial'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      </target>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <console type='pty' tty='/dev/pts/0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <source path='/dev/pts/0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <log file='/var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/console.log' append='off'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <target type='serial' port='0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='serial0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </console>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='tablet' bus='usb'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='usb' bus='0' port='1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='mouse' bus='ps2'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input1'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <input type='keyboard' bus='ps2'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='input2'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </input>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <listen type='address' address='::0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </graphics>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <audio id='1' type='none'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <model type='virtio' heads='1' primary='yes'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='video0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <watchdog model='itco' action='reset'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='watchdog0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </watchdog>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <memballoon model='virtio'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <stats period='10'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='balloon0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <rng model='virtio'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <backend model='random'>/dev/urandom</backend>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <alias name='rng0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <label>system_u:system_r:svirt_t:s0:c337,c775</label>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c337,c775</imagelabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <label>+107:+107</label>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <imagelabel>+107:+107</imagelabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </seclabel>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.371 186853 INFO nova.virt.libvirt.driver [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully detached device tap650f9e14-a6 from instance 87323ed4-21ca-4440-802a-6f396fa56b00 from the live domain config.#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.372 186853 DEBUG nova.virt.libvirt.vif [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.373 186853 DEBUG nova.network.os_vif_util [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "address": "fa:16:3e:35:e8:28", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap650f9e14-a6", "ovs_interfaceid": "650f9e14-a6b8-46d0-8167-1eb22fcbc8fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.373 186853 DEBUG nova.network.os_vif_util [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.374 186853 DEBUG os_vif [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.376 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.377 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap650f9e14-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.377 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:e8:28 10.100.0.6'], port_security=['fa:16:3e:35:e8:28 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '87323ed4-21ca-4440-802a-6f396fa56b00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-638313878', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4a48801-4b3f-49e9-aa90-fb1d486a915e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.379 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.380 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.381 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4a282c-db22-41de-b34b-2960aa032ca8#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.384 186853 INFO os_vif [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:e8:28,bridge_name='br-int',has_traffic_filtering=True,id=650f9e14-a6b8-46d0-8167-1eb22fcbc8fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap650f9e14-a6')#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.385 186853 DEBUG nova.virt.libvirt.guest [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:name>tempest-tempest.common.compute-instance-2137784552</nova:name>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:creationTime>2025-11-22 08:03:38</nova:creationTime>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:flavor name="m1.nano">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:memory>128</nova:memory>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:disk>1</nova:disk>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:swap>0</nova:swap>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:vcpus>1</nova:vcpus>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:flavor>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:user uuid="53d50a77d3f2416c8fcc459cc343d045">tempest-AttachInterfacesTestJSON-148239263-project-member</nova:user>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:project uuid="2ce7c88f9ad440f494bdba03e7ece1bf">tempest-AttachInterfacesTestJSON-148239263</nova:project>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:owner>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  <nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    <nova:port uuid="59d94620-ece2-4ba1-96e7-f25b12dc87fc">
Nov 22 03:03:38 np0005531887 nova_compute[186849]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:    </nova:port>
Nov 22 03:03:38 np0005531887 nova_compute[186849]:  </nova:ports>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: </nova:instance>
Nov 22 03:03:38 np0005531887 nova_compute[186849]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.399 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea637d8-9245-4eca-a7fc-475be8e87a38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.432 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6dabb07e-2567-4a2d-9866-107c001292df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.435 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2911445e-9a28-490c-9b6f-881ab39465f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.465 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c73215e0-a42c-4aa8-87d5-f6414b7461a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.485 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fac0c39e-b11f-4812-8db1-57c2869fe6ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4a282c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:7a:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532790, 'reachable_time': 24498, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228367, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.501 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a11528b0-0510-4f40-96fa-6fc25db71410]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532803, 'tstamp': 532803}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228368, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6a4a282c-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532806, 'tstamp': 532806}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228368, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.502 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.503 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 nova_compute[186849]: 2025-11-22 08:03:38.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.505 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4a282c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.506 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.507 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4a282c-d0, col_values=(('external_ids', {'iface-id': '26692495-261e-4628-ae4d-0a33d676c097'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:03:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:03:38.507 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.643 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.643 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.644 186853 DEBUG nova.network.neutron [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.946 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.946 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.947 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.947 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.947 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.948 186853 WARNING nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.948 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.948 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.949 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.949 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.949 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.949 186853 WARNING nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-unplugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.950 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.950 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.950 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.950 186853 DEBUG oslo_concurrency.lockutils [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.951 186853 DEBUG nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:03:39 np0005531887 nova_compute[186849]: 2025-11-22 08:03:39.951 186853 WARNING nova.compute.manager [req-724a466d-a9d9-480b-b410-4cdd0be466ce req-b955d9e8-4c83-4df9-a6a2-23694c4a1732 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-plugged-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc for instance with vm_state active and task_state None.#033[00m
Nov 22 03:03:40 np0005531887 nova_compute[186849]: 2025-11-22 08:03:40.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:40 np0005531887 podman[228370]: 2025-11-22 08:03:40.852196154 +0000 UTC m=+0.070718545 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:03:41 np0005531887 nova_compute[186849]: 2025-11-22 08:03:41.133 186853 INFO nova.network.neutron [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Port 650f9e14-a6b8-46d0-8167-1eb22fcbc8fc from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 22 03:03:41 np0005531887 nova_compute[186849]: 2025-11-22 08:03:41.133 186853 DEBUG nova.network.neutron [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:41 np0005531887 nova_compute[186849]: 2025-11-22 08:03:41.176 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:41 np0005531887 nova_compute[186849]: 2025-11-22 08:03:41.217 186853 DEBUG oslo_concurrency.lockutils [None req-2e46aa80-dec3-445d-8793-bdbf3315afa7 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "interface-87323ed4-21ca-4440-802a-6f396fa56b00-650f9e14-a6b8-46d0-8167-1eb22fcbc8fc" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:41 np0005531887 nova_compute[186849]: 2025-11-22 08:03:41.738 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:42 np0005531887 nova_compute[186849]: 2025-11-22 08:03:42.665 186853 DEBUG nova.compute.manager [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:03:42 np0005531887 nova_compute[186849]: 2025-11-22 08:03:42.666 186853 DEBUG nova.compute.manager [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing instance network info cache due to event network-changed-59d94620-ece2-4ba1-96e7-f25b12dc87fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:03:42 np0005531887 nova_compute[186849]: 2025-11-22 08:03:42.666 186853 DEBUG oslo_concurrency.lockutils [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:03:42 np0005531887 nova_compute[186849]: 2025-11-22 08:03:42.666 186853 DEBUG oslo_concurrency.lockutils [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:03:42 np0005531887 nova_compute[186849]: 2025-11-22 08:03:42.666 186853 DEBUG nova.network.neutron [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Refreshing network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:03:43 np0005531887 nova_compute[186849]: 2025-11-22 08:03:43.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:43 np0005531887 nova_compute[186849]: 2025-11-22 08:03:43.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:43 np0005531887 nova_compute[186849]: 2025-11-22 08:03:43.839 186853 DEBUG nova.network.neutron [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated VIF entry in instance network info cache for port 59d94620-ece2-4ba1-96e7-f25b12dc87fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:03:43 np0005531887 nova_compute[186849]: 2025-11-22 08:03:43.839 186853 DEBUG nova.network.neutron [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:03:43 np0005531887 nova_compute[186849]: 2025-11-22 08:03:43.865 186853 DEBUG oslo_concurrency.lockutils [req-1c0d63de-1889-462f-a173-96d847ef359c req-ecea571a-2462-4af3-b33b-bed560a606b8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:03:46 np0005531887 nova_compute[186849]: 2025-11-22 08:03:46.741 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:46 np0005531887 podman[228393]: 2025-11-22 08:03:46.838709583 +0000 UTC m=+0.056708086 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git)
Nov 22 03:03:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:03:47Z|00314|binding|INFO|Releasing lport 26692495-261e-4628-ae4d-0a33d676c097 from this chassis (sb_readonly=0)
Nov 22 03:03:47 np0005531887 nova_compute[186849]: 2025-11-22 08:03:47.245 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:48 np0005531887 nova_compute[186849]: 2025-11-22 08:03:48.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:49 np0005531887 nova_compute[186849]: 2025-11-22 08:03:49.792 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:50 np0005531887 podman[228415]: 2025-11-22 08:03:50.850431417 +0000 UTC m=+0.064421474 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 03:03:50 np0005531887 podman[228416]: 2025-11-22 08:03:50.902934034 +0000 UTC m=+0.113782740 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:03:51 np0005531887 nova_compute[186849]: 2025-11-22 08:03:51.744 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.932 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.998 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:52 np0005531887 nova_compute[186849]: 2025-11-22 08:03:52.999 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.063 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.266 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.267 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5580MB free_disk=73.31700897216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.267 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.268 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.759 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 87323ed4-21ca-4440-802a-6f396fa56b00 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.760 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.760 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.900 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.913 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.972 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:03:53 np0005531887 nova_compute[186849]: 2025-11-22 08:03:53.972 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:03:54 np0005531887 nova_compute[186849]: 2025-11-22 08:03:54.967 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:54 np0005531887 nova_compute[186849]: 2025-11-22 08:03:54.967 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:54 np0005531887 nova_compute[186849]: 2025-11-22 08:03:54.967 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:03:55 np0005531887 podman[228469]: 2025-11-22 08:03:55.84683696 +0000 UTC m=+0.058675887 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:03:56 np0005531887 nova_compute[186849]: 2025-11-22 08:03:56.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:56 np0005531887 nova_compute[186849]: 2025-11-22 08:03:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:56 np0005531887 nova_compute[186849]: 2025-11-22 08:03:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:58 np0005531887 nova_compute[186849]: 2025-11-22 08:03:58.388 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:03:58 np0005531887 nova_compute[186849]: 2025-11-22 08:03:58.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:58 np0005531887 nova_compute[186849]: 2025-11-22 08:03:58.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:58 np0005531887 nova_compute[186849]: 2025-11-22 08:03:58.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:03:58 np0005531887 nova_compute[186849]: 2025-11-22 08:03:58.787 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:03:59 np0005531887 nova_compute[186849]: 2025-11-22 08:03:59.788 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:03:59 np0005531887 nova_compute[186849]: 2025-11-22 08:03:59.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:03:59 np0005531887 nova_compute[186849]: 2025-11-22 08:03:59.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:04:00 np0005531887 nova_compute[186849]: 2025-11-22 08:04:00.052 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:00 np0005531887 nova_compute[186849]: 2025-11-22 08:04:00.053 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:00 np0005531887 nova_compute[186849]: 2025-11-22 08:04:00.054 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:04:00 np0005531887 nova_compute[186849]: 2025-11-22 08:04:00.054 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:01 np0005531887 nova_compute[186849]: 2025-11-22 08:04:01.748 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:02 np0005531887 podman[228492]: 2025-11-22 08:04:02.863365829 +0000 UTC m=+0.078568188 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.394 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.814 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [{"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.830 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-87323ed4-21ca-4440-802a-6f396fa56b00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.830 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.831 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:03 np0005531887 nova_compute[186849]: 2025-11-22 08:04:03.831 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:04.123 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:04 np0005531887 nova_compute[186849]: 2025-11-22 08:04:04.125 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:04.125 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:04:05 np0005531887 nova_compute[186849]: 2025-11-22 08:04:05.867 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:06 np0005531887 nova_compute[186849]: 2025-11-22 08:04:06.750 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:06 np0005531887 nova_compute[186849]: 2025-11-22 08:04:06.781 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:06 np0005531887 nova_compute[186849]: 2025-11-22 08:04:06.802 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:06 np0005531887 nova_compute[186849]: 2025-11-22 08:04:06.803 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:04:06 np0005531887 podman[228511]: 2025-11-22 08:04:06.838496466 +0000 UTC m=+0.055496343 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:04:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:07.128 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:08 np0005531887 nova_compute[186849]: 2025-11-22 08:04:08.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.666 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.696 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Triggering sync for uuid 87323ed4-21ca-4440-802a-6f396fa56b00 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.697 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.697 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.720 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:11 np0005531887 nova_compute[186849]: 2025-11-22 08:04:11.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:11 np0005531887 podman[228533]: 2025-11-22 08:04:11.838872638 +0000 UTC m=+0.055816142 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:04:13 np0005531887 nova_compute[186849]: 2025-11-22 08:04:13.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.105 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.838 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.839 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.840 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.840 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.841 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.848 186853 INFO nova.compute.manager [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Terminating instance#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.853 186853 DEBUG nova.compute.manager [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:04:14 np0005531887 kernel: tap59d94620-ec (unregistering): left promiscuous mode
Nov 22 03:04:14 np0005531887 NetworkManager[55210]: <info>  [1763798654.9266] device (tap59d94620-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:04:14 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:14Z|00315|binding|INFO|Releasing lport 59d94620-ece2-4ba1-96e7-f25b12dc87fc from this chassis (sb_readonly=0)
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.932 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:14 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:14Z|00316|binding|INFO|Setting lport 59d94620-ece2-4ba1-96e7-f25b12dc87fc down in Southbound
Nov 22 03:04:14 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:14Z|00317|binding|INFO|Removing iface tap59d94620-ec ovn-installed in OVS
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:14 np0005531887 nova_compute[186849]: 2025-11-22 08:04:14.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:14.961 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:71:fb 10.100.0.8'], port_security=['fa:16:3e:78:71:fb 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87323ed4-21ca-4440-802a-6f396fa56b00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4a282c-db22-41de-b34b-2960aa032ca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ce7c88f9ad440f494bdba03e7ece1bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6da332c7-e52a-4f92-8c24-c2ee0c6e77d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0acaade9-a442-4c9d-a882-bd397d30fce8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=59d94620-ece2-4ba1-96e7-f25b12dc87fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:04:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:14.962 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 59d94620-ece2-4ba1-96e7-f25b12dc87fc in datapath 6a4a282c-db22-41de-b34b-2960aa032ca8 unbound from our chassis#033[00m
Nov 22 03:04:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:14.964 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a4a282c-db22-41de-b34b-2960aa032ca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:04:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:14.966 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[28302d52-ed9b-42ab-bc59-0a6cb0b74747]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:14.967 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 namespace which is not needed anymore#033[00m
Nov 22 03:04:14 np0005531887 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000065.scope: Deactivated successfully.
Nov 22 03:04:14 np0005531887 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000065.scope: Consumed 19.060s CPU time.
Nov 22 03:04:14 np0005531887 systemd-machined[153180]: Machine qemu-39-instance-00000065 terminated.
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.111 186853 INFO nova.virt.libvirt.driver [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Instance destroyed successfully.#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.113 186853 DEBUG nova.objects.instance [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lazy-loading 'resources' on Instance uuid 87323ed4-21ca-4440-802a-6f396fa56b00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.136 186853 DEBUG nova.virt.libvirt.vif [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2137784552',display_name='tempest-tempest.common.compute-instance-2137784552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2137784552',id=101,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ2pvf/H+ujHg0O+DCfl3iSnSZB+hOZsT1N0h7AAWcza7jj+TC3mzLnwxXf8MuF024jaM5DCjx5HRt44Je85H8cdbToJjcwxwTiW4fjXAIcLYMsjBTMa7LxgILfw3UjssQ==',key_name='tempest-keypair-981839442',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ce7c88f9ad440f494bdba03e7ece1bf',ramdisk_id='',reservation_id='r-r80dn0e5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-148239263',owner_user_name='tempest-AttachInterfacesTestJSON-148239263-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='53d50a77d3f2416c8fcc459cc343d045',uuid=87323ed4-21ca-4440-802a-6f396fa56b00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.136 186853 DEBUG nova.network.os_vif_util [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converting VIF {"id": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "address": "fa:16:3e:78:71:fb", "network": {"id": "6a4a282c-db22-41de-b34b-2960aa032ca8", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-641791513-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2ce7c88f9ad440f494bdba03e7ece1bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59d94620-ec", "ovs_interfaceid": "59d94620-ece2-4ba1-96e7-f25b12dc87fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.137 186853 DEBUG nova.network.os_vif_util [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.138 186853 DEBUG os_vif [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.139 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.140 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59d94620-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.143 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.146 186853 INFO os_vif [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:71:fb,bridge_name='br-int',has_traffic_filtering=True,id=59d94620-ece2-4ba1-96e7-f25b12dc87fc,network=Network(6a4a282c-db22-41de-b34b-2960aa032ca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59d94620-ec')#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.146 186853 INFO nova.virt.libvirt.driver [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Deleting instance files /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00_del#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.147 186853 INFO nova.virt.libvirt.driver [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Deletion of /var/lib/nova/instances/87323ed4-21ca-4440-802a-6f396fa56b00_del complete#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.245 186853 INFO nova.compute.manager [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.246 186853 DEBUG oslo.service.loopingcall [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.246 186853 DEBUG nova.compute.manager [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:04:15 np0005531887 nova_compute[186849]: 2025-11-22 08:04:15.246 186853 DEBUG nova.network.neutron [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:04:15 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [NOTICE]   (228155) : haproxy version is 2.8.14-c23fe91
Nov 22 03:04:15 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [NOTICE]   (228155) : path to executable is /usr/sbin/haproxy
Nov 22 03:04:15 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [WARNING]  (228155) : Exiting Master process...
Nov 22 03:04:15 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [ALERT]    (228155) : Current worker (228157) exited with code 143 (Terminated)
Nov 22 03:04:15 np0005531887 neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8[228151]: [WARNING]  (228155) : All workers exited. Exiting... (0)
Nov 22 03:04:15 np0005531887 systemd[1]: libpod-7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea.scope: Deactivated successfully.
Nov 22 03:04:15 np0005531887 podman[228583]: 2025-11-22 08:04:15.459748194 +0000 UTC m=+0.401031410 container died 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:04:16 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea-userdata-shm.mount: Deactivated successfully.
Nov 22 03:04:16 np0005531887 systemd[1]: var-lib-containers-storage-overlay-74bb2844f1bff1c1301e99ae8ebfa5449098a366a5b38d3285eea2788b7024bc-merged.mount: Deactivated successfully.
Nov 22 03:04:16 np0005531887 nova_compute[186849]: 2025-11-22 08:04:16.757 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:16 np0005531887 podman[228583]: 2025-11-22 08:04:16.781080021 +0000 UTC m=+1.722363247 container cleanup 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:04:16 np0005531887 systemd[1]: libpod-conmon-7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea.scope: Deactivated successfully.
Nov 22 03:04:17 np0005531887 podman[228629]: 2025-11-22 08:04:17.766570218 +0000 UTC m=+0.955950976 container remove 7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.773 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6df2ce-637e-48bb-8483-0cf2649bc2e2]: (4, ('Sat Nov 22 08:04:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea)\n7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea\nSat Nov 22 08:04:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 (7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea)\n7e1c178a584d1652a474516f3e657c20b0d62b02781df843c656e4e5355057ea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.775 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1afb2b65-0b21-40e3-b64e-0f14eb8e4e6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.776 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4a282c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:04:17 np0005531887 nova_compute[186849]: 2025-11-22 08:04:17.780 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:17 np0005531887 kernel: tap6a4a282c-d0: left promiscuous mode
Nov 22 03:04:17 np0005531887 nova_compute[186849]: 2025-11-22 08:04:17.793 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.797 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[02e7d44f-16f6-469f-8aa8-89d88492306b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.813 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3787bb52-b14c-4400-aa91-1ccbad2dbbf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.815 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac33c48-4b5c-4496-9bc4-04ce84c24438]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.829 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb46df2-6f10-4968-b3b1-bd2c38b9eea3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532783, 'reachable_time': 35662, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228658, 'error': None, 'target': 'ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6a4a282c\x2ddb22\x2d41de\x2db34b\x2d2960aa032ca8.mount: Deactivated successfully.
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.837 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a4a282c-db22-41de-b34b-2960aa032ca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:04:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:17.837 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[6b15dd56-bfba-403a-afed-d4da3a675412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:04:17 np0005531887 podman[228642]: 2025-11-22 08:04:17.853883312 +0000 UTC m=+0.065815027 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.607 186853 DEBUG nova.compute.manager [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-unplugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.608 186853 DEBUG oslo_concurrency.lockutils [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.608 186853 DEBUG oslo_concurrency.lockutils [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.608 186853 DEBUG oslo_concurrency.lockutils [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.608 186853 DEBUG nova.compute.manager [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-unplugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:18 np0005531887 nova_compute[186849]: 2025-11-22 08:04:18.608 186853 DEBUG nova.compute.manager [req-7555609b-2a62-4195-87cf-539bc3b2d83a req-dd080552-cfc0-4ae9-8fbe-cd54c73c8a60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-unplugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.241 186853 DEBUG nova.compute.manager [req-1c803796-dbc3-4994-b9be-8690d6e6a87c req-6eb612bf-2a31-49bc-9233-dfdb2ed806f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-deleted-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.242 186853 INFO nova.compute.manager [req-1c803796-dbc3-4994-b9be-8690d6e6a87c req-6eb612bf-2a31-49bc-9233-dfdb2ed806f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Neutron deleted interface 59d94620-ece2-4ba1-96e7-f25b12dc87fc; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.242 186853 DEBUG nova.network.neutron [req-1c803796-dbc3-4994-b9be-8690d6e6a87c req-6eb612bf-2a31-49bc-9233-dfdb2ed806f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.244 186853 DEBUG nova.network.neutron [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.272 186853 INFO nova.compute.manager [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Took 4.03 seconds to deallocate network for instance.#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.274 186853 DEBUG nova.compute.manager [req-1c803796-dbc3-4994-b9be-8690d6e6a87c req-6eb612bf-2a31-49bc-9233-dfdb2ed806f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Detach interface failed, port_id=59d94620-ece2-4ba1-96e7-f25b12dc87fc, reason: Instance 87323ed4-21ca-4440-802a-6f396fa56b00 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.396 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.396 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.626 186853 DEBUG nova.compute.provider_tree [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.656 186853 DEBUG nova.scheduler.client.report [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.683 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.734 186853 INFO nova.scheduler.client.report [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Deleted allocations for instance 87323ed4-21ca-4440-802a-6f396fa56b00#033[00m
Nov 22 03:04:19 np0005531887 nova_compute[186849]: 2025-11-22 08:04:19.833 186853 DEBUG oslo_concurrency.lockutils [None req-ce096e0f-51f4-46e3-96b0-70ed93e5bfda 53d50a77d3f2416c8fcc459cc343d045 2ce7c88f9ad440f494bdba03e7ece1bf - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.142 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.716 186853 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.716 186853 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.717 186853 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.717 186853 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "87323ed4-21ca-4440-802a-6f396fa56b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.717 186853 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] No waiting events found dispatching network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:04:20 np0005531887 nova_compute[186849]: 2025-11-22 08:04:20.717 186853 WARNING nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Received unexpected event network-vif-plugged-59d94620-ece2-4ba1-96e7-f25b12dc87fc for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:04:21 np0005531887 nova_compute[186849]: 2025-11-22 08:04:21.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:21 np0005531887 podman[228667]: 2025-11-22 08:04:21.875369465 +0000 UTC m=+0.084915203 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 03:04:21 np0005531887 podman[228668]: 2025-11-22 08:04:21.890161171 +0000 UTC m=+0.093287925 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:04:21 np0005531887 nova_compute[186849]: 2025-11-22 08:04:21.937 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:21 np0005531887 nova_compute[186849]: 2025-11-22 08:04:21.937 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.001 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.119 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.120 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.126 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.126 186853 INFO nova.compute.claims [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.280 186853 DEBUG nova.compute.provider_tree [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.296 186853 DEBUG nova.scheduler.client.report [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.332 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.333 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.435 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.436 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.454 186853 INFO nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.477 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.649 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.651 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.651 186853 INFO nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Creating image(s)#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.652 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.652 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.653 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.670 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.740 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.741 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.742 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.757 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.816 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.817 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:22 np0005531887 nova_compute[186849]: 2025-11-22 08:04:22.876 186853 DEBUG nova.policy [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f4c19573a27c494699060e7ea79d5515', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.684 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk 1073741824" returned: 0 in 0.867s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.685 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.686 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.753 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.755 186853 DEBUG nova.virt.disk.api [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Checking if we can resize image /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.755 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.813 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.815 186853 DEBUG nova.virt.disk.api [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Cannot resize image /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.815 186853 DEBUG nova.objects.instance [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.844 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.845 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Ensure instance console log exists: /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.845 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.845 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:23 np0005531887 nova_compute[186849]: 2025-11-22 08:04:23.846 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:24 np0005531887 nova_compute[186849]: 2025-11-22 08:04:24.435 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Successfully created port: 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:04:25 np0005531887 nova_compute[186849]: 2025-11-22 08:04:25.144 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:26 np0005531887 nova_compute[186849]: 2025-11-22 08:04:26.760 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:26 np0005531887 podman[228728]: 2025-11-22 08:04:26.832329893 +0000 UTC m=+0.052797555 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.112 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798655.1096313, 87323ed4-21ca-4440-802a-6f396fa56b00 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.113 186853 INFO nova.compute.manager [-] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.145 186853 DEBUG nova.compute.manager [None req-2d5c062f-979c-41e4-b835-bfd45fbfb85d - - - - - -] [instance: 87323ed4-21ca-4440-802a-6f396fa56b00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.147 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.742 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Successfully updated port: 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.773 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.773 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.774 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.963 186853 DEBUG nova.compute.manager [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.963 186853 DEBUG nova.compute.manager [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing instance network info cache due to event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:04:30 np0005531887 nova_compute[186849]: 2025-11-22 08:04:30.964 186853 DEBUG oslo_concurrency.lockutils [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:31 np0005531887 nova_compute[186849]: 2025-11-22 08:04:31.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:31 np0005531887 nova_compute[186849]: 2025-11-22 08:04:31.770 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:04:33 np0005531887 podman[228752]: 2025-11-22 08:04:33.840865949 +0000 UTC m=+0.064537244 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:04:35 np0005531887 nova_compute[186849]: 2025-11-22 08:04:35.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:04:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:04:36 np0005531887 nova_compute[186849]: 2025-11-22 08:04:36.763 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:36 np0005531887 nova_compute[186849]: 2025-11-22 08:04:36.978 186853 DEBUG nova.network.neutron [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.113 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.113 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance network_info: |[{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.114 186853 DEBUG oslo_concurrency.lockutils [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.115 186853 DEBUG nova.network.neutron [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.118 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start _get_guest_xml network_info=[{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.123 186853 WARNING nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.140 186853 DEBUG nova.virt.libvirt.host [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.141 186853 DEBUG nova.virt.libvirt.host [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.147 186853 DEBUG nova.virt.libvirt.host [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.147 186853 DEBUG nova.virt.libvirt.host [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.148 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.149 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.149 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.149 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.150 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.150 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.150 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.150 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.150 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.151 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.151 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.151 186853 DEBUG nova.virt.hardware [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.154 186853 DEBUG nova.virt.libvirt.vif [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-967969482',display_name='tempest-ServerRescueTestJSONUnderV235-server-967969482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-967969482',id=104,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='95e2ccde25b541d0968f3ccee43d9e35',ramdisk_id='',reservation_id='r-c0ra48lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-938613993',owner_user_name
='tempest-ServerRescueTestJSONUnderV235-938613993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:04:22Z,user_data=None,user_id='f4c19573a27c494699060e7ea79d5515',uuid=4a06bc4f-7ec7-498b-9018-a4f2601aab63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.155 186853 DEBUG nova.network.os_vif_util [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converting VIF {"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.156 186853 DEBUG nova.network.os_vif_util [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.156 186853 DEBUG nova.objects.instance [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.168 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <uuid>4a06bc4f-7ec7-498b-9018-a4f2601aab63</uuid>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <name>instance-00000068</name>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-967969482</nova:name>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:04:37</nova:creationTime>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:user uuid="f4c19573a27c494699060e7ea79d5515">tempest-ServerRescueTestJSONUnderV235-938613993-project-member</nova:user>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:project uuid="95e2ccde25b541d0968f3ccee43d9e35">tempest-ServerRescueTestJSONUnderV235-938613993</nova:project>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        <nova:port uuid="4fffc9c1-c4ac-4556-adf2-3c53f1c5d511">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="serial">4a06bc4f-7ec7-498b-9018-a4f2601aab63</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="uuid">4a06bc4f-7ec7-498b-9018-a4f2601aab63</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:5a:11:d8"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <target dev="tap4fffc9c1-c4"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/console.log" append="off"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:04:37 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:04:37 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:04:37 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:04:37 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.169 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Preparing to wait for external event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.170 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.170 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.170 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.171 186853 DEBUG nova.virt.libvirt.vif [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-967969482',display_name='tempest-ServerRescueTestJSONUnderV235-server-967969482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-967969482',id=104,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='95e2ccde25b541d0968f3ccee43d9e35',ramdisk_id='',reservation_id='r-c0ra48lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-938613993',owner
_user_name='tempest-ServerRescueTestJSONUnderV235-938613993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:04:22Z,user_data=None,user_id='f4c19573a27c494699060e7ea79d5515',uuid=4a06bc4f-7ec7-498b-9018-a4f2601aab63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.171 186853 DEBUG nova.network.os_vif_util [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converting VIF {"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.172 186853 DEBUG nova.network.os_vif_util [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.173 186853 DEBUG os_vif [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.173 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.173 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.174 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.176 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.176 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4fffc9c1-c4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.177 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4fffc9c1-c4, col_values=(('external_ids', {'iface-id': '4fffc9c1-c4ac-4556-adf2-3c53f1c5d511', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:11:d8', 'vm-uuid': '4a06bc4f-7ec7-498b-9018-a4f2601aab63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.178 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:37 np0005531887 NetworkManager[55210]: <info>  [1763798677.1797] manager: (tap4fffc9c1-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.181 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.186 186853 INFO os_vif [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4')
Nov 22 03:04:37 np0005531887 podman[228775]: 2025-11-22 08:04:37.274557039 +0000 UTC m=+0.054802856 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.305 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.305 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.305 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No VIF found with MAC fa:16:3e:5a:11:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.306 186853 INFO nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Using config drive
Nov 22 03:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:37.337 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:37.338 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:37.338 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.961 186853 INFO nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Creating config drive at /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config
Nov 22 03:04:37 np0005531887 nova_compute[186849]: 2025-11-22 08:04:37.965 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdkzyt6e8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:04:38 np0005531887 nova_compute[186849]: 2025-11-22 08:04:38.097 186853 DEBUG oslo_concurrency.processutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdkzyt6e8" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:04:38 np0005531887 kernel: tap4fffc9c1-c4: entered promiscuous mode
Nov 22 03:04:38 np0005531887 NetworkManager[55210]: <info>  [1763798678.1637] manager: (tap4fffc9c1-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Nov 22 03:04:38 np0005531887 nova_compute[186849]: 2025-11-22 08:04:38.162 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:38Z|00318|binding|INFO|Claiming lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for this chassis.
Nov 22 03:04:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:38Z|00319|binding|INFO|4fffc9c1-c4ac-4556-adf2-3c53f1c5d511: Claiming fa:16:3e:5a:11:d8 10.100.0.13
Nov 22 03:04:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:38Z|00320|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 ovn-installed in OVS
Nov 22 03:04:38 np0005531887 nova_compute[186849]: 2025-11-22 08:04:38.176 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:38 np0005531887 nova_compute[186849]: 2025-11-22 08:04:38.177 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:38 np0005531887 nova_compute[186849]: 2025-11-22 08:04:38.182 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:04:38Z|00321|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 up in Southbound
Nov 22 03:04:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:38.183 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:04:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:38.184 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f bound to our chassis
Nov 22 03:04:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:38.185 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 03:04:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:04:38.186 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3558bc-4bb8-4a57-a2e1-b2e247379e57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:04:38 np0005531887 systemd-udevd[228810]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:04:38 np0005531887 NetworkManager[55210]: <info>  [1763798678.2058] device (tap4fffc9c1-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:04:38 np0005531887 NetworkManager[55210]: <info>  [1763798678.2066] device (tap4fffc9c1-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:04:38 np0005531887 systemd-machined[153180]: New machine qemu-40-instance-00000068.
Nov 22 03:04:38 np0005531887 systemd[1]: Started Virtual Machine qemu-40-instance-00000068.
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.108 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798679.1075165, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.109 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Started (Lifecycle Event)
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.155 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.161 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798679.1089647, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.161 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Paused (Lifecycle Event)
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.176 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.180 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.197 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.261 186853 DEBUG nova.compute.manager [req-9f002c18-ae7e-480e-8411-ae2dd54d17f4 req-08602a61-ed56-4419-8d4a-838b88b2ff39 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.261 186853 DEBUG oslo_concurrency.lockutils [req-9f002c18-ae7e-480e-8411-ae2dd54d17f4 req-08602a61-ed56-4419-8d4a-838b88b2ff39 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.262 186853 DEBUG oslo_concurrency.lockutils [req-9f002c18-ae7e-480e-8411-ae2dd54d17f4 req-08602a61-ed56-4419-8d4a-838b88b2ff39 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.262 186853 DEBUG oslo_concurrency.lockutils [req-9f002c18-ae7e-480e-8411-ae2dd54d17f4 req-08602a61-ed56-4419-8d4a-838b88b2ff39 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.262 186853 DEBUG nova.compute.manager [req-9f002c18-ae7e-480e-8411-ae2dd54d17f4 req-08602a61-ed56-4419-8d4a-838b88b2ff39 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Processing event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.263 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.267 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798679.267171, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.267 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Resumed (Lifecycle Event)
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.269 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.273 186853 INFO nova.virt.libvirt.driver [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance spawned successfully.
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.273 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.324 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.330 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.334 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.334 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.335 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.335 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.336 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.336 186853 DEBUG nova.virt.libvirt.driver [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.360 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.591 186853 INFO nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Took 16.94 seconds to spawn the instance on the hypervisor.
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.591 186853 DEBUG nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:04:39 np0005531887 nova_compute[186849]: 2025-11-22 08:04:39.967 186853 INFO nova.compute.manager [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Took 17.88 seconds to build instance.
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.077 186853 DEBUG oslo_concurrency.lockutils [None req-b5cff7ec-5f8d-4305-a8da-308ac84b4c7a f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.624 186853 DEBUG nova.network.neutron [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated VIF entry in instance network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.625 186853 DEBUG nova.network.neutron [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.649 186853 DEBUG oslo_concurrency.lockutils [req-2e5a9ac6-b828-4853-b4cb-421a9b22b500 req-2689e570-6e5d-4c47-b422-e4c7e75215ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.749 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:40 np0005531887 nova_compute[186849]: 2025-11-22 08:04:40.890 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.400 186853 DEBUG nova.compute.manager [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.401 186853 DEBUG oslo_concurrency.lockutils [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.401 186853 DEBUG oslo_concurrency.lockutils [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.401 186853 DEBUG oslo_concurrency.lockutils [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.401 186853 DEBUG nova.compute.manager [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.402 186853 WARNING nova.compute.manager [req-51049d46-52c7-4529-8b7c-4416c45aaf16 req-cfe49240-ab44-4a83-90a0-41518a93ebb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state active and task_state None.
Nov 22 03:04:41 np0005531887 nova_compute[186849]: 2025-11-22 08:04:41.765 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:42 np0005531887 nova_compute[186849]: 2025-11-22 08:04:42.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:04:42 np0005531887 podman[228829]: 2025-11-22 08:04:42.834015665 +0000 UTC m=+0.051592545 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:04:45 np0005531887 nova_compute[186849]: 2025-11-22 08:04:45.242 186853 INFO nova.compute.manager [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Rescuing#033[00m
Nov 22 03:04:45 np0005531887 nova_compute[186849]: 2025-11-22 08:04:45.243 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:45 np0005531887 nova_compute[186849]: 2025-11-22 08:04:45.243 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:45 np0005531887 nova_compute[186849]: 2025-11-22 08:04:45.243 186853 DEBUG nova.network.neutron [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:04:46 np0005531887 nova_compute[186849]: 2025-11-22 08:04:46.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:47 np0005531887 nova_compute[186849]: 2025-11-22 08:04:47.181 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:48 np0005531887 podman[228853]: 2025-11-22 08:04:48.862161451 +0000 UTC m=+0.082388419 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=)
Nov 22 03:04:50 np0005531887 nova_compute[186849]: 2025-11-22 08:04:50.800 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:51 np0005531887 nova_compute[186849]: 2025-11-22 08:04:51.768 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:52 np0005531887 nova_compute[186849]: 2025-11-22 08:04:52.183 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:52 np0005531887 nova_compute[186849]: 2025-11-22 08:04:52.771 186853 DEBUG nova.network.neutron [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:04:52 np0005531887 nova_compute[186849]: 2025-11-22 08:04:52.813 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:04:52 np0005531887 podman[228873]: 2025-11-22 08:04:52.861774548 +0000 UTC m=+0.071347277 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:04:52 np0005531887 podman[228874]: 2025-11-22 08:04:52.914354496 +0000 UTC m=+0.115349667 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.810 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.918 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.988 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:53 np0005531887 nova_compute[186849]: 2025-11-22 08:04:53.988 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.026 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.057 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.248 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.251 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5565MB free_disk=73.34492874145508GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.251 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.252 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.360 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 4a06bc4f-7ec7-498b-9018-a4f2601aab63 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.360 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.361 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.427 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.443 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.484 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:04:54 np0005531887 nova_compute[186849]: 2025-11-22 08:04:54.485 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:04:55 np0005531887 nova_compute[186849]: 2025-11-22 08:04:55.479 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:55 np0005531887 nova_compute[186849]: 2025-11-22 08:04:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:55 np0005531887 nova_compute[186849]: 2025-11-22 08:04:55.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:04:56 np0005531887 nova_compute[186849]: 2025-11-22 08:04:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:56 np0005531887 nova_compute[186849]: 2025-11-22 08:04:56.770 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:56 np0005531887 nova_compute[186849]: 2025-11-22 08:04:56.772 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:57 np0005531887 nova_compute[186849]: 2025-11-22 08:04:57.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:04:57 np0005531887 podman[228941]: 2025-11-22 08:04:57.828227406 +0000 UTC m=+0.051529193 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.772 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.798 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:04:59 np0005531887 nova_compute[186849]: 2025-11-22 08:04:59.798 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:01 np0005531887 nova_compute[186849]: 2025-11-22 08:05:01.772 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.068 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.087 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.088 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.089 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.089 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:02 np0005531887 nova_compute[186849]: 2025-11-22 08:05:02.188 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:04 np0005531887 nova_compute[186849]: 2025-11-22 08:05:04.081 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:05:04 np0005531887 podman[228970]: 2025-11-22 08:05:04.853444356 +0000 UTC m=+0.079050633 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:05:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:05.344 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:05.345 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:05:05 np0005531887 nova_compute[186849]: 2025-11-22 08:05:05.345 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:05.345 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:06 np0005531887 kernel: tap4fffc9c1-c4 (unregistering): left promiscuous mode
Nov 22 03:05:06 np0005531887 NetworkManager[55210]: <info>  [1763798706.4185] device (tap4fffc9c1-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.427 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:06Z|00322|binding|INFO|Releasing lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 from this chassis (sb_readonly=0)
Nov 22 03:05:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:06Z|00323|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 down in Southbound
Nov 22 03:05:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:06Z|00324|binding|INFO|Removing iface tap4fffc9c1-c4 ovn-installed in OVS
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.439 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:06.454 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:06.457 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f unbound from our chassis#033[00m
Nov 22 03:05:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:06.457 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:05:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:06.459 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[96d2cbc7-a127-4774-8225-2e154d84c699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:06 np0005531887 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000068.scope: Deactivated successfully.
Nov 22 03:05:06 np0005531887 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000068.scope: Consumed 18.424s CPU time.
Nov 22 03:05:06 np0005531887 systemd-machined[153180]: Machine qemu-40-instance-00000068 terminated.
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.775 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.911 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:06 np0005531887 nova_compute[186849]: 2025-11-22 08:05:06.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.099 186853 INFO nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance shutdown successfully after 13 seconds.#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.106 186853 INFO nova.virt.libvirt.driver [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance destroyed successfully.#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.106 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.190 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.195 186853 INFO nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Attempting rescue#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.196 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.200 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.200 186853 INFO nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Creating image(s)#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.201 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.201 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.201 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.202 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.246 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.246 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.257 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.280 186853 DEBUG nova.compute.manager [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.281 186853 DEBUG oslo_concurrency.lockutils [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.284 186853 DEBUG oslo_concurrency.lockutils [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.284 186853 DEBUG oslo_concurrency.lockutils [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.284 186853 DEBUG nova.compute.manager [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.285 186853 WARNING nova.compute.manager [req-dd7bc2f5-43b4-4c42-a83d-d97987a28895 req-a2896b20-8504-4966-95ac-86862b3a6d0c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.317 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.317 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.560 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.561 186853 DEBUG oslo_concurrency.lockutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.562 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.574 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.575 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start _get_guest_xml network_info=[{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "vif_mac": "fa:16:3e:5a:11:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.576 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'resources' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.596 186853 WARNING nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.603 186853 DEBUG nova.virt.libvirt.host [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.604 186853 DEBUG nova.virt.libvirt.host [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.607 186853 DEBUG nova.virt.libvirt.host [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.608 186853 DEBUG nova.virt.libvirt.host [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.609 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.610 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.610 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.610 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.610 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.611 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.611 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.611 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.611 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.611 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.612 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.612 186853 DEBUG nova.virt.hardware [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.612 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.640 186853 DEBUG nova.virt.libvirt.vif [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-967969482',display_name='tempest-ServerRescueTestJSONUnderV235-server-967969482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-967969482',id=104,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:04:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='95e2ccde25b541d0968f3ccee43d9e35',ramdisk_id='',reservation_id='r-c0ra48lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-938613993',owner_user_name='tempest-ServerRescueTestJSONUnderV235-938613993-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:04:39Z,user_data=None,user_id='f4c19573a27c494699060e7ea79d5515',uuid=4a06bc4f-7ec7-498b-9018-a4f2601aab63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "vif_mac": "fa:16:3e:5a:11:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.640 186853 DEBUG nova.network.os_vif_util [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converting VIF {"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "vif_mac": "fa:16:3e:5a:11:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.641 186853 DEBUG nova.network.os_vif_util [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.642 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.659 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <uuid>4a06bc4f-7ec7-498b-9018-a4f2601aab63</uuid>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <name>instance-00000068</name>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-967969482</nova:name>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:05:07</nova:creationTime>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:user uuid="f4c19573a27c494699060e7ea79d5515">tempest-ServerRescueTestJSONUnderV235-938613993-project-member</nova:user>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:project uuid="95e2ccde25b541d0968f3ccee43d9e35">tempest-ServerRescueTestJSONUnderV235-938613993</nova:project>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        <nova:port uuid="4fffc9c1-c4ac-4556-adf2-3c53f1c5d511">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="serial">4a06bc4f-7ec7-498b-9018-a4f2601aab63</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="uuid">4a06bc4f-7ec7-498b-9018-a4f2601aab63</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <target dev="vdb" bus="virtio"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config.rescue"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:5a:11:d8"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <target dev="tap4fffc9c1-c4"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/console.log" append="off"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:05:07 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:05:07 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:05:07 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:05:07 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.667 186853 INFO nova.virt.libvirt.driver [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance destroyed successfully.#033[00m
Nov 22 03:05:07 np0005531887 podman[229019]: 2025-11-22 08:05:07.772840864 +0000 UTC m=+0.068114374 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.986 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.986 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.986 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.986 186853 DEBUG nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] No VIF found with MAC fa:16:3e:5a:11:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.987 186853 INFO nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Using config drive#033[00m
Nov 22 03:05:07 np0005531887 nova_compute[186849]: 2025-11-22 08:05:07.999 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:08 np0005531887 nova_compute[186849]: 2025-11-22 08:05:08.051 186853 DEBUG nova.objects.instance [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'keypairs' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.250 186853 INFO nova.virt.libvirt.driver [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Creating config drive at /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config.rescue#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.255 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvmenikgj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.383 186853 DEBUG oslo_concurrency.processutils [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvmenikgj" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:09 np0005531887 kernel: tap4fffc9c1-c4: entered promiscuous mode
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.459 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:09 np0005531887 NetworkManager[55210]: <info>  [1763798709.4617] manager: (tap4fffc9c1-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Nov 22 03:05:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:09Z|00325|binding|INFO|Claiming lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for this chassis.
Nov 22 03:05:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:09Z|00326|binding|INFO|4fffc9c1-c4ac-4556-adf2-3c53f1c5d511: Claiming fa:16:3e:5a:11:d8 10.100.0.13
Nov 22 03:05:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:09.469 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:09.470 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f bound to our chassis#033[00m
Nov 22 03:05:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:09.471 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:05:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:09.472 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8855527a-1048-4ca9-98bb-c776c237b429]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:09Z|00327|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 up in Southbound
Nov 22 03:05:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:09Z|00328|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 ovn-installed in OVS
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:09 np0005531887 systemd-udevd[229057]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.499 186853 DEBUG nova.compute.manager [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.500 186853 DEBUG oslo_concurrency.lockutils [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.500 186853 DEBUG oslo_concurrency.lockutils [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.500 186853 DEBUG oslo_concurrency.lockutils [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.500 186853 DEBUG nova.compute.manager [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.501 186853 WARNING nova.compute.manager [req-dd13bfcc-1101-4dd3-9d9d-26c2c9bb7bcf req-219e8259-f716-4b1a-b26f-3810b20ac227 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:05:09 np0005531887 NetworkManager[55210]: <info>  [1763798709.5046] device (tap4fffc9c1-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:05:09 np0005531887 NetworkManager[55210]: <info>  [1763798709.5056] device (tap4fffc9c1-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:05:09 np0005531887 systemd-machined[153180]: New machine qemu-41-instance-00000068.
Nov 22 03:05:09 np0005531887 systemd[1]: Started Virtual Machine qemu-41-instance-00000068.
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.948 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 4a06bc4f-7ec7-498b-9018-a4f2601aab63 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.950 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798709.947386, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.950 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.967 186853 DEBUG nova.compute.manager [None req-17b5dd1e-865e-4199-ad2f-fdc15faab230 f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:09 np0005531887 nova_compute[186849]: 2025-11-22 08:05:09.994 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.000 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.018 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.019 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798709.948316, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.019 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Started (Lifecycle Event)#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.034 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.037 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:05:10 np0005531887 nova_compute[186849]: 2025-11-22 08:05:10.059 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.726 186853 DEBUG nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.726 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.727 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.727 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.727 186853 DEBUG nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.728 186853 WARNING nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state rescued and task_state None.#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.728 186853 DEBUG nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.728 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.728 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.729 186853 DEBUG oslo_concurrency.lockutils [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.729 186853 DEBUG nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.729 186853 WARNING nova.compute.manager [req-15a1c16b-f6b3-4749-bc8e-9cc567c26c08 req-1faef56f-b47b-4c4a-86a9-e212409150e3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state rescued and task_state None.#033[00m
Nov 22 03:05:11 np0005531887 nova_compute[186849]: 2025-11-22 08:05:11.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:12 np0005531887 nova_compute[186849]: 2025-11-22 08:05:12.191 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:13 np0005531887 podman[229076]: 2025-11-22 08:05:13.858819113 +0000 UTC m=+0.069973192 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:05:16 np0005531887 nova_compute[186849]: 2025-11-22 08:05:16.778 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.193 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.818 186853 DEBUG nova.compute.manager [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.818 186853 DEBUG nova.compute.manager [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing instance network info cache due to event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.819 186853 DEBUG oslo_concurrency.lockutils [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.819 186853 DEBUG oslo_concurrency.lockutils [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:17 np0005531887 nova_compute[186849]: 2025-11-22 08:05:17.819 186853 DEBUG nova.network.neutron [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:19 np0005531887 podman[229100]: 2025-11-22 08:05:19.84649949 +0000 UTC m=+0.064494374 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Nov 22 03:05:21 np0005531887 nova_compute[186849]: 2025-11-22 08:05:21.105 186853 DEBUG nova.compute.manager [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:21 np0005531887 nova_compute[186849]: 2025-11-22 08:05:21.106 186853 DEBUG nova.compute.manager [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing instance network info cache due to event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:21 np0005531887 nova_compute[186849]: 2025-11-22 08:05:21.106 186853 DEBUG oslo_concurrency.lockutils [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:21 np0005531887 nova_compute[186849]: 2025-11-22 08:05:21.780 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.195 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.827 186853 DEBUG nova.network.neutron [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated VIF entry in instance network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.828 186853 DEBUG nova.network.neutron [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.880 186853 DEBUG oslo_concurrency.lockutils [req-2099f9ce-e467-4138-af95-d16589fc5bfd req-98f608f1-e5ab-4a4e-9ceb-2572967308ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.881 186853 DEBUG oslo_concurrency.lockutils [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:22 np0005531887 nova_compute[186849]: 2025-11-22 08:05:22.882 186853 DEBUG nova.network.neutron [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:23 np0005531887 podman[229135]: 2025-11-22 08:05:23.868324481 +0000 UTC m=+0.074823075 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:05:23 np0005531887 podman[229136]: 2025-11-22 08:05:23.912454984 +0000 UTC m=+0.113053648 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:05:26 np0005531887 nova_compute[186849]: 2025-11-22 08:05:26.784 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:27 np0005531887 nova_compute[186849]: 2025-11-22 08:05:27.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:28 np0005531887 nova_compute[186849]: 2025-11-22 08:05:28.192 186853 DEBUG nova.network.neutron [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated VIF entry in instance network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:28 np0005531887 nova_compute[186849]: 2025-11-22 08:05:28.193 186853 DEBUG nova.network.neutron [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:28 np0005531887 nova_compute[186849]: 2025-11-22 08:05:28.251 186853 DEBUG oslo_concurrency.lockutils [req-3d87d72a-94f1-4889-a26c-2388c821d2b5 req-ae9afede-5993-4cea-814a-cbd2e47c9258 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:28 np0005531887 podman[229180]: 2025-11-22 08:05:28.844754435 +0000 UTC m=+0.063564789 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:05:31 np0005531887 nova_compute[186849]: 2025-11-22 08:05:31.788 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:32 np0005531887 nova_compute[186849]: 2025-11-22 08:05:32.200 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:34 np0005531887 nova_compute[186849]: 2025-11-22 08:05:34.918 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:34 np0005531887 NetworkManager[55210]: <info>  [1763798734.9199] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Nov 22 03:05:34 np0005531887 NetworkManager[55210]: <info>  [1763798734.9214] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Nov 22 03:05:35 np0005531887 nova_compute[186849]: 2025-11-22 08:05:35.037 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:35 np0005531887 nova_compute[186849]: 2025-11-22 08:05:35.058 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:35 np0005531887 podman[229205]: 2025-11-22 08:05:35.846171407 +0000 UTC m=+0.067131270 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.790 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.809 186853 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.810 186853 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing instance network info cache due to event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.810 186853 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.810 186853 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:36 np0005531887 nova_compute[186849]: 2025-11-22 08:05:36.811 186853 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:37 np0005531887 nova_compute[186849]: 2025-11-22 08:05:37.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:37 np0005531887 nova_compute[186849]: 2025-11-22 08:05:37.202 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:37.339 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:37.340 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:37.340 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:38 np0005531887 podman[229224]: 2025-11-22 08:05:38.854540579 +0000 UTC m=+0.063728902 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 22 03:05:41 np0005531887 nova_compute[186849]: 2025-11-22 08:05:41.376 186853 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated VIF entry in instance network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:41 np0005531887 nova_compute[186849]: 2025-11-22 08:05:41.377 186853 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:41 np0005531887 nova_compute[186849]: 2025-11-22 08:05:41.407 186853 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:41 np0005531887 nova_compute[186849]: 2025-11-22 08:05:41.792 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:42 np0005531887 nova_compute[186849]: 2025-11-22 08:05:42.204 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:44 np0005531887 podman[229247]: 2025-11-22 08:05:44.843482779 +0000 UTC m=+0.059682911 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.480 186853 DEBUG nova.compute.manager [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.481 186853 DEBUG nova.compute.manager [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing instance network info cache due to event network-changed-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.481 186853 DEBUG oslo_concurrency.lockutils [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.481 186853 DEBUG oslo_concurrency.lockutils [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.482 186853 DEBUG nova.network.neutron [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Refreshing network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:05:46 np0005531887 nova_compute[186849]: 2025-11-22 08:05:46.796 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:47 np0005531887 nova_compute[186849]: 2025-11-22 08:05:47.207 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:48 np0005531887 nova_compute[186849]: 2025-11-22 08:05:48.678 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:49 np0005531887 nova_compute[186849]: 2025-11-22 08:05:49.326 186853 DEBUG nova.network.neutron [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updated VIF entry in instance network info cache for port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:05:49 np0005531887 nova_compute[186849]: 2025-11-22 08:05:49.326 186853 DEBUG nova.network.neutron [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [{"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:49 np0005531887 nova_compute[186849]: 2025-11-22 08:05:49.346 186853 DEBUG oslo_concurrency.lockutils [req-4b319edf-2724-4b26-8810-33f38f070ec8 req-b07141b2-3273-49ba-9a87-92a681558c2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4a06bc4f-7ec7-498b-9018-a4f2601aab63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:05:50 np0005531887 podman[229271]: 2025-11-22 08:05:50.854802076 +0000 UTC m=+0.068418602 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Nov 22 03:05:51 np0005531887 nova_compute[186849]: 2025-11-22 08:05:51.802 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:52 np0005531887 nova_compute[186849]: 2025-11-22 08:05:52.210 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:52 np0005531887 nova_compute[186849]: 2025-11-22 08:05:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.809 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.909 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.976 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:53 np0005531887 nova_compute[186849]: 2025-11-22 08:05:53.977 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.040 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk.rescue --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.041 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.102 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.103 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.175 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.312 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.313 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.313 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.314 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.314 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.325 186853 INFO nova.compute.manager [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Terminating instance#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.334 186853 DEBUG nova.compute.manager [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.385 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.386 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5574MB free_disk=73.31600189208984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.387 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.387 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.491 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 4a06bc4f-7ec7-498b-9018-a4f2601aab63 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.492 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.492 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.519 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:05:54 np0005531887 kernel: tap4fffc9c1-c4 (unregistering): left promiscuous mode
Nov 22 03:05:54 np0005531887 NetworkManager[55210]: <info>  [1763798754.5660] device (tap4fffc9c1-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00329|binding|INFO|Releasing lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 from this chassis (sb_readonly=0)
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00330|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 down in Southbound
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00331|binding|INFO|Removing iface tap4fffc9c1-c4 ovn-installed in OVS
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.582 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.585 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.586 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.588 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.597 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.593 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.596 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f unbound from our chassis#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.597 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.600 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5b57bf77-ac2f-49a6-b207-caa10191ac32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:54 np0005531887 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000068.scope: Deactivated successfully.
Nov 22 03:05:54 np0005531887 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000068.scope: Consumed 16.162s CPU time.
Nov 22 03:05:54 np0005531887 systemd-machined[153180]: Machine qemu-41-instance-00000068 terminated.
Nov 22 03:05:54 np0005531887 podman[229306]: 2025-11-22 08:05:54.638443564 +0000 UTC m=+0.087511628 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.649 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:05:54 np0005531887 podman[229307]: 2025-11-22 08:05:54.652711168 +0000 UTC m=+0.096957520 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.689 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.738 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.754 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:05:54 np0005531887 kernel: tap4fffc9c1-c4: entered promiscuous mode
Nov 22 03:05:54 np0005531887 kernel: tap4fffc9c1-c4 (unregistering): left promiscuous mode
Nov 22 03:05:54 np0005531887 NetworkManager[55210]: <info>  [1763798754.7649] manager: (tap4fffc9c1-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.767 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00332|binding|INFO|Claiming lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for this chassis.
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00333|binding|INFO|4fffc9c1-c4ac-4556-adf2-3c53f1c5d511: Claiming fa:16:3e:5a:11:d8 10.100.0.13
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.775 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.776 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.776 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.777 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f bound to our chassis#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.779 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.780 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e5255ac1-2b72-4456-a39a-35c59d76fd7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.781 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.784 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00334|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 ovn-installed in OVS
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00335|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 up in Southbound
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00336|binding|INFO|Releasing lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 from this chassis (sb_readonly=1)
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00337|if_status|INFO|Dropped 1 log messages in last 195 seconds (most recently, 195 seconds ago) due to excessive rate
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00338|if_status|INFO|Not setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 down as sb is readonly
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.786 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00339|binding|INFO|Removing iface tap4fffc9c1-c4 ovn-installed in OVS
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.788 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00340|binding|INFO|Releasing lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 from this chassis (sb_readonly=0)
Nov 22 03:05:54 np0005531887 ovn_controller[95130]: 2025-11-22T08:05:54Z|00341|binding|INFO|Setting lport 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 down in Southbound
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.797 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:11:d8 10.100.0.13'], port_security=['fa:16:3e:5a:11:d8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4a06bc4f-7ec7-498b-9018-a4f2601aab63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b608b756-9b87-425a-824b-5086cdee060f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95e2ccde25b541d0968f3ccee43d9e35', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd4fcd730-8d70-42d5-b697-6271c3ce4abe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c3bad8-9be1-43db-b0a3-de166df88b92, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.798 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.799 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 in datapath b608b756-9b87-425a-824b-5086cdee060f unbound from our chassis#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.800 104084 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b608b756-9b87-425a-824b-5086cdee060f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 22 03:05:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:05:54.801 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f33bfd66-e841-4696-a1f5-343323ed9382]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.843 186853 INFO nova.virt.libvirt.driver [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Instance destroyed successfully.#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.844 186853 DEBUG nova.objects.instance [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lazy-loading 'resources' on Instance uuid 4a06bc4f-7ec7-498b-9018-a4f2601aab63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.875 186853 DEBUG nova.virt.libvirt.vif [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-967969482',display_name='tempest-ServerRescueTestJSONUnderV235-server-967969482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-967969482',id=104,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='95e2ccde25b541d0968f3ccee43d9e35',ramdisk_id='',reservation_id='r-c0ra48lo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-938613993',owner_user_name='tempest-ServerRescueTestJSONUnderV235-938613993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:10Z,user_data=None,user_id='f4c19573a27c494699060e7ea79d5515',uuid=4a06bc4f-7ec7-498b-9018-a4f2601aab63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.876 186853 DEBUG nova.network.os_vif_util [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converting VIF {"id": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "address": "fa:16:3e:5a:11:d8", "network": {"id": "b608b756-9b87-425a-824b-5086cdee060f", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-265562927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "95e2ccde25b541d0968f3ccee43d9e35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fffc9c1-c4", "ovs_interfaceid": "4fffc9c1-c4ac-4556-adf2-3c53f1c5d511", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.876 186853 DEBUG nova.network.os_vif_util [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.877 186853 DEBUG os_vif [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.878 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.879 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fffc9c1-c4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.886 186853 INFO os_vif [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:11:d8,bridge_name='br-int',has_traffic_filtering=True,id=4fffc9c1-c4ac-4556-adf2-3c53f1c5d511,network=Network(b608b756-9b87-425a-824b-5086cdee060f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fffc9c1-c4')#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.887 186853 INFO nova.virt.libvirt.driver [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Deleting instance files /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63_del#033[00m
Nov 22 03:05:54 np0005531887 nova_compute[186849]: 2025-11-22 08:05:54.887 186853 INFO nova.virt.libvirt.driver [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Deletion of /var/lib/nova/instances/4a06bc4f-7ec7-498b-9018-a4f2601aab63_del complete#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.028 186853 INFO nova.compute.manager [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.029 186853 DEBUG oslo.service.loopingcall [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.029 186853 DEBUG nova.compute.manager [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.029 186853 DEBUG nova.network.neutron [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.100 186853 DEBUG nova.compute.manager [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.101 186853 DEBUG oslo_concurrency.lockutils [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.101 186853 DEBUG oslo_concurrency.lockutils [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.101 186853 DEBUG oslo_concurrency.lockutils [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.101 186853 DEBUG nova.compute.manager [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.101 186853 DEBUG nova.compute.manager [req-255ef967-4cea-4d8d-8877-dfc062c1e819 req-c25e0df9-8783-45a6-8d39-9a0b09e3e076 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-unplugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.620 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:55 np0005531887 nova_compute[186849]: 2025-11-22 08:05:55.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.405 186853 DEBUG nova.network.neutron [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.455 186853 INFO nova.compute.manager [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Took 1.43 seconds to deallocate network for instance.#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.557 186853 DEBUG nova.compute.manager [req-a38121d1-6c4e-4710-a692-d4c1707fc25b req-ba3df7f2-68ce-40c5-a857-219dbdcf3248 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-deleted-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.572 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.573 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.640 186853 DEBUG nova.compute.provider_tree [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.657 186853 DEBUG nova.scheduler.client.report [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.694 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.745 186853 INFO nova.scheduler.client.report [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Deleted allocations for instance 4a06bc4f-7ec7-498b-9018-a4f2601aab63#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.805 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:05:56 np0005531887 nova_compute[186849]: 2025-11-22 08:05:56.891 186853 DEBUG oslo_concurrency.lockutils [None req-c59e0e5e-6d76-429d-a310-bdb6b442521d f4c19573a27c494699060e7ea79d5515 95e2ccde25b541d0968f3ccee43d9e35 - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.255 186853 DEBUG nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.255 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.255 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.256 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.256 186853 DEBUG nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.256 186853 WARNING nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.256 186853 DEBUG nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.257 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.257 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.257 186853 DEBUG oslo_concurrency.lockutils [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4a06bc4f-7ec7-498b-9018-a4f2601aab63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.257 186853 DEBUG nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] No waiting events found dispatching network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.257 186853 WARNING nova.compute.manager [req-aa51ae64-f1aa-450a-83bf-c1c7e2eb8b56 req-78fd2c2c-e3e2-454d-a6e1-240a7e8cf7dd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Received unexpected event network-vif-plugged-4fffc9c1-c4ac-4556-adf2-3c53f1c5d511 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:57 np0005531887 nova_compute[186849]: 2025-11-22 08:05:57.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:05:59 np0005531887 nova_compute[186849]: 2025-11-22 08:05:59.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:05:59 np0005531887 nova_compute[186849]: 2025-11-22 08:05:59.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:05:59 np0005531887 nova_compute[186849]: 2025-11-22 08:05:59.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:05:59 np0005531887 nova_compute[186849]: 2025-11-22 08:05:59.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:05:59 np0005531887 podman[229376]: 2025-11-22 08:05:59.847241994 +0000 UTC m=+0.065784015 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:05:59 np0005531887 nova_compute[186849]: 2025-11-22 08:05:59.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:01 np0005531887 nova_compute[186849]: 2025-11-22 08:06:01.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:01 np0005531887 nova_compute[186849]: 2025-11-22 08:06:01.807 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:02 np0005531887 nova_compute[186849]: 2025-11-22 08:06:02.117 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:02 np0005531887 nova_compute[186849]: 2025-11-22 08:06:02.355 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:03 np0005531887 nova_compute[186849]: 2025-11-22 08:06:03.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:04 np0005531887 nova_compute[186849]: 2025-11-22 08:06:04.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:06 np0005531887 nova_compute[186849]: 2025-11-22 08:06:06.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:06 np0005531887 nova_compute[186849]: 2025-11-22 08:06:06.809 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:06 np0005531887 podman[229401]: 2025-11-22 08:06:06.842616163 +0000 UTC m=+0.060969153 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 03:06:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:06.981 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:06.982 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:06:06 np0005531887 nova_compute[186849]: 2025-11-22 08:06:06.982 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:09 np0005531887 nova_compute[186849]: 2025-11-22 08:06:09.843 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798754.8414996, 4a06bc4f-7ec7-498b-9018-a4f2601aab63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:09 np0005531887 nova_compute[186849]: 2025-11-22 08:06:09.843 186853 INFO nova.compute.manager [-] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:06:09 np0005531887 podman[229419]: 2025-11-22 08:06:09.845541598 +0000 UTC m=+0.065044507 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:06:09 np0005531887 nova_compute[186849]: 2025-11-22 08:06:09.859 186853 DEBUG nova.compute.manager [None req-7bf82cfa-6316-405b-8de1-2a370387a61b - - - - - -] [instance: 4a06bc4f-7ec7-498b-9018-a4f2601aab63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:09 np0005531887 nova_compute[186849]: 2025-11-22 08:06:09.883 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:11 np0005531887 nova_compute[186849]: 2025-11-22 08:06:11.811 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:11 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:11.983 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:14 np0005531887 nova_compute[186849]: 2025-11-22 08:06:14.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:15 np0005531887 podman[229440]: 2025-11-22 08:06:15.835436952 +0000 UTC m=+0.058373298 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:06:16 np0005531887 nova_compute[186849]: 2025-11-22 08:06:16.814 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:19 np0005531887 nova_compute[186849]: 2025-11-22 08:06:19.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:21 np0005531887 nova_compute[186849]: 2025-11-22 08:06:21.816 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:21 np0005531887 podman[229462]: 2025-11-22 08:06:21.859566977 +0000 UTC m=+0.072762504 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:06:24 np0005531887 podman[229484]: 2025-11-22 08:06:24.849356367 +0000 UTC m=+0.066284728 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:06:24 np0005531887 podman[229485]: 2025-11-22 08:06:24.879347631 +0000 UTC m=+0.092763263 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:06:24 np0005531887 nova_compute[186849]: 2025-11-22 08:06:24.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.426 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.426 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.451 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.580 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.581 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.586 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.586 186853 INFO nova.compute.claims [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.745 186853 DEBUG nova.compute.provider_tree [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.761 186853 DEBUG nova.scheduler.client.report [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.780 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.781 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.821 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.869 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.870 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.895 186853 INFO nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:06:26 np0005531887 nova_compute[186849]: 2025-11-22 08:06:26.922 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.034 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.036 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.037 186853 INFO nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Creating image(s)#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.037 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.037 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.038 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.054 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.113 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.114 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.115 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.130 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.161 186853 DEBUG nova.policy [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.194 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.195 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.414 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk 1073741824" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.415 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.416 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.479 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.480 186853 DEBUG nova.virt.disk.api [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Checking if we can resize image /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.481 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.546 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.547 186853 DEBUG nova.virt.disk.api [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Cannot resize image /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.548 186853 DEBUG nova.objects.instance [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.569 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.570 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Ensure instance console log exists: /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.570 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.570 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:27 np0005531887 nova_compute[186849]: 2025-11-22 08:06:27.571 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:28 np0005531887 nova_compute[186849]: 2025-11-22 08:06:28.485 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Successfully created port: 0eb38acd-9bcc-4884-9e50-c571e7d6a405 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:06:29 np0005531887 nova_compute[186849]: 2025-11-22 08:06:29.889 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.024 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Successfully updated port: 0eb38acd-9bcc-4884-9e50-c571e7d6a405 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.043 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.043 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquired lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.043 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.176 186853 DEBUG nova.compute.manager [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-changed-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.176 186853 DEBUG nova.compute.manager [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Refreshing instance network info cache due to event network-changed-0eb38acd-9bcc-4884-9e50-c571e7d6a405. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.176 186853 DEBUG oslo_concurrency.lockutils [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:30 np0005531887 nova_compute[186849]: 2025-11-22 08:06:30.565 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:06:30 np0005531887 podman[229544]: 2025-11-22 08:06:30.834422818 +0000 UTC m=+0.050653451 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:06:31 np0005531887 nova_compute[186849]: 2025-11-22 08:06:31.824 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.041 186853 DEBUG nova.network.neutron [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updating instance_info_cache with network_info: [{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.073 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Releasing lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.074 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance network_info: |[{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.074 186853 DEBUG oslo_concurrency.lockutils [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.074 186853 DEBUG nova.network.neutron [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Refreshing network info cache for port 0eb38acd-9bcc-4884-9e50-c571e7d6a405 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.077 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start _get_guest_xml network_info=[{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.083 186853 WARNING nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.095 186853 DEBUG nova.virt.libvirt.host [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.096 186853 DEBUG nova.virt.libvirt.host [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.105 186853 DEBUG nova.virt.libvirt.host [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.107 186853 DEBUG nova.virt.libvirt.host [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.108 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.109 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.109 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.109 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.110 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.110 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.110 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.110 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.111 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.111 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.111 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.111 186853 DEBUG nova.virt.hardware [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.116 186853 DEBUG nova.virt.libvirt.vif [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1944607136',display_name='tempest-ServerRescueNegativeTestJSON-server-1944607136',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1944607136',id=108,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-caomrmyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner_user_name
='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:26Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=7f1b779f-4565-4529-a2c8-dcb0414326f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.116 186853 DEBUG nova.network.os_vif_util [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.117 186853 DEBUG nova.network.os_vif_util [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.118 186853 DEBUG nova.objects.instance [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.130 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <uuid>7f1b779f-4565-4529-a2c8-dcb0414326f6</uuid>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <name>instance-0000006c</name>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1944607136</nova:name>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:06:32</nova:creationTime>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:user uuid="2c1b21c06c9b48d39e736b195bd12c8c">tempest-ServerRescueNegativeTestJSON-1724156244-project-member</nova:user>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:project uuid="8f7086819eb340f28dd7087159d82fa3">tempest-ServerRescueNegativeTestJSON-1724156244</nova:project>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        <nova:port uuid="0eb38acd-9bcc-4884-9e50-c571e7d6a405">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="serial">7f1b779f-4565-4529-a2c8-dcb0414326f6</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="uuid">7f1b779f-4565-4529-a2c8-dcb0414326f6</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:6e:8b:e0"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <target dev="tap0eb38acd-9b"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/console.log" append="off"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:06:32 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:06:32 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:06:32 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:06:32 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.132 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Preparing to wait for external event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.132 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.133 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.133 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.133 186853 DEBUG nova.virt.libvirt.vif [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1944607136',display_name='tempest-ServerRescueNegativeTestJSON-server-1944607136',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1944607136',id=108,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-caomrmyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner
_user_name='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:26Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=7f1b779f-4565-4529-a2c8-dcb0414326f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.134 186853 DEBUG nova.network.os_vif_util [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.135 186853 DEBUG nova.network.os_vif_util [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.135 186853 DEBUG os_vif [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.135 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.136 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.137 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.142 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0eb38acd-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.142 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0eb38acd-9b, col_values=(('external_ids', {'iface-id': '0eb38acd-9bcc-4884-9e50-c571e7d6a405', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:8b:e0', 'vm-uuid': '7f1b779f-4565-4529-a2c8-dcb0414326f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.149 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:06:32 np0005531887 NetworkManager[55210]: <info>  [1763798792.1485] manager: (tap0eb38acd-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.155 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.156 186853 INFO os_vif [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b')#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.241 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.242 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.242 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No VIF found with MAC fa:16:3e:6e:8b:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.242 186853 INFO nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Using config drive#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.800 186853 INFO nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Creating config drive at /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.808 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcm9espo_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:32 np0005531887 nova_compute[186849]: 2025-11-22 08:06:32.939 186853 DEBUG oslo_concurrency.processutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcm9espo_" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:33 np0005531887 kernel: tap0eb38acd-9b: entered promiscuous mode
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.0135] manager: (tap0eb38acd-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Nov 22 03:06:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:33Z|00342|binding|INFO|Claiming lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 for this chassis.
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.015 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:33Z|00343|binding|INFO|0eb38acd-9bcc-4884-9e50-c571e7d6a405: Claiming fa:16:3e:6e:8b:e0 10.100.0.8
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.018 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.029 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:8b:e0 10.100.0.8'], port_security=['fa:16:3e:6e:8b:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0eb38acd-9bcc-4884-9e50-c571e7d6a405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.031 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb38acd-9bcc-4884-9e50-c571e7d6a405 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 bound to our chassis#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.033 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9714091-78f6-46c8-b55b-4a278bd99b49#033[00m
Nov 22 03:06:33 np0005531887 systemd-udevd[229588]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.046 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5486801a-1aae-4143-bbc1-77000d38cd37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.047 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9714091-71 in ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.049 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9714091-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.050 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[213c8a9c-1ba3-4f07-9b8c-4189691926b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.052 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3eec42-6c30-4aea-b80e-f7f6f9a78da0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.0625] device (tap0eb38acd-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.0637] device (tap0eb38acd-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:06:33 np0005531887 systemd-machined[153180]: New machine qemu-42-instance-0000006c.
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.069 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ebc774-491c-42c6-a5ec-dbb4b75eeae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.074 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:33Z|00344|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 ovn-installed in OVS
Nov 22 03:06:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:33Z|00345|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 up in Southbound
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.081 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 systemd[1]: Started Virtual Machine qemu-42-instance-0000006c.
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.083 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfa76b3-b0b3-4cf3-92c5-f6b6af78f29e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.114 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b46e0f-9149-4b1c-bd33-d27a15bfd78c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.119 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6d20142f-02c0-47a2-bcb3-306f977bac71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.1207] manager: (tapf9714091-70): new Veth device (/org/freedesktop/NetworkManager/Devices/158)
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.152 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8365cef0-eed8-4521-8b54-7cd45f4e199a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.155 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfcf120-a4b5-4681-a299-9f69d062e01b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.1810] device (tapf9714091-70): carrier: link connected
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.186 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6707237c-d536-4a0e-9e4c-e4bf69b99d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.206 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3f10bb-c2bb-4d9a-80b6-436cfbe0f792]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553578, 'reachable_time': 19710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229621, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.223 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9606eee7-a985-4455-8719-aa18fec66ab9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5583'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 553578, 'tstamp': 553578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229622, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.244 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0f20505d-5b65-46e0-98e2-4dbac6a70e00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553578, 'reachable_time': 19710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229623, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.286 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e7817c-77c3-4c5b-bff1-0feb7138fdb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.357 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e41cdec1-05fc-4cd2-b81e-93fed32b87db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.360 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.361 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.361 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9714091-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:33 np0005531887 kernel: tapf9714091-70: entered promiscuous mode
Nov 22 03:06:33 np0005531887 NetworkManager[55210]: <info>  [1763798793.3641] manager: (tapf9714091-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.366 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9714091-70, col_values=(('external_ids', {'iface-id': '298be65c-aa9e-4327-b67d-2a3d4f1acf68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:06:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:33Z|00346|binding|INFO|Releasing lport 298be65c-aa9e-4327-b67d-2a3d4f1acf68 from this chassis (sb_readonly=0)
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.384 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.386 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[467d4798-3e42-4016-aa82-1b2d0f1c8808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.388 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:06:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:33.390 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'env', 'PROCESS_TAG=haproxy-f9714091-78f6-46c8-b55b-4a278bd99b49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9714091-78f6-46c8-b55b-4a278bd99b49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:06:33 np0005531887 podman[229655]: 2025-11-22 08:06:33.80267532 +0000 UTC m=+0.079751041 container create f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:06:33 np0005531887 podman[229655]: 2025-11-22 08:06:33.747399343 +0000 UTC m=+0.024475084 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:06:33 np0005531887 systemd[1]: Started libpod-conmon-f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580.scope.
Nov 22 03:06:33 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:06:33 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4ae92f610f64a0680a90855d762b6100c65d4a96e003c2ef984b661eb3b718c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:06:33 np0005531887 podman[229655]: 2025-11-22 08:06:33.893766009 +0000 UTC m=+0.170841750 container init f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 03:06:33 np0005531887 podman[229655]: 2025-11-22 08:06:33.900577742 +0000 UTC m=+0.177653463 container start f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:06:33 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [NOTICE]   (229681) : New worker (229683) forked
Nov 22 03:06:33 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [NOTICE]   (229681) : Loading success.
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.983 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798793.9822056, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:33 np0005531887 nova_compute[186849]: 2025-11-22 08:06:33.983 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Started (Lifecycle Event)#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.009 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.014 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798793.98258, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.014 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.038 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.041 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.067 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.442 186853 DEBUG nova.network.neutron [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updated VIF entry in instance network info cache for port 0eb38acd-9bcc-4884-9e50-c571e7d6a405. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.443 186853 DEBUG nova.network.neutron [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updating instance_info_cache with network_info: [{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:34 np0005531887 nova_compute[186849]: 2025-11-22 08:06:34.461 186853 DEBUG oslo_concurrency.lockutils [req-c9595773-f7e3-4827-ba48-daa8dc1c1b14 req-08b3ebc1-1f67-4b0c-a6f3-641f762d4b98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:35.142 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2 2001:db8::f816:3eff:fe1e:28f1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe1e:28f1/64', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0abd56a4-3e9e-4d28-8383-eadcda41744d) old=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:06:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:35.145 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0abd56a4-3e9e-4d28-8383-eadcda41744d in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e updated#033[00m
Nov 22 03:06:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:35.146 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90da6fca-65d1-4012-9602-d88842a0ad0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:06:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:35.147 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac8eb53-98dd-47af-b498-90a953496b2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.668 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '8f7086819eb340f28dd7087159d82fa3', 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'hostId': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.682 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.683 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebb7118d-6fa7-4b68-9b3d-454c93b68904', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.669916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad216ae-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': '7d208dd0b1c44fcff4b0420c241bde94e78e6cda2cef4b96c99ad148bbb75909'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 
'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.669916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad2228e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': '29e6cc8d185f624d09ec0a917d6389f9a565e31aaf426f3a29cfc8648745d57a'}]}, 'timestamp': '2025-11-22 08:06:36.683850', '_unique_id': '2e195103e2f740a98ad21f958c71f15f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.685 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.686 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0227ba3-7923-4892-b820-986db2027254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.685848', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad27b26-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': 'a6da217bc87a086ee393d8f9d158464f711684a162a755e5f3b9f4711c0d6822'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.685848', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad28742-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': '3ee785ba0f640050f6dc111294d5f3eb11dde0d53a3835a3d558aacb81007eee'}]}, 'timestamp': '2025-11-22 08:06:36.686461', '_unique_id': '2ec2e441a6c74fae83ff605e6e2a166a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.687 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>]
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.691 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7f1b779f-4565-4529-a2c8-dcb0414326f6 / tap0eb38acd-9b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.691 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca56d43c-99b1-4214-a959-9b5a77da4efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.688117', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2ad350dc-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': 'fe7b75fa22b9f2f1e1941e32b05b60b892a39dcf95d949a22fa8064548c7b06a'}]}, 'timestamp': '2025-11-22 08:06:36.691653', '_unique_id': '00aa113b3cde45a89bdcbc2f426cabce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.718 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.719 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a765483-30e7-405c-bc9e-ea59abadd0c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.693261', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad784ea-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '025d578a7654014986a675a0df943a6e6bc302119c64307dd08dbb8c19bd67ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.693261', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad791ec-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '1f15b13993d3d8c47bd4417c0cce700c5dbc2d647de96ffe4c6cfe402a06da04'}]}, 'timestamp': '2025-11-22 08:06:36.719487', '_unique_id': '96741e5a6bba436f89929a17681b14fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.720 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.721 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606d8410-60e5-4de1-83c4-1919b866876f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.721307', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2ad7e4c6-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': 'e802258175d3f5bca39bc5da4a70efc786186d170505bb4f1f27184c8691bf0f'}]}, 'timestamp': '2025-11-22 08:06:36.721597', '_unique_id': 'ba8b4160a4b24a72b88de51b9bddbaa4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.722 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f422e9df-2c8c-4056-ac8d-e5956334f5f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.722906', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad821f2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '2e5185ecf18747b879144d606fda0e7d373f585d28098ebe62f8b110b5c14682'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.722906', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad82d50-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': 'c25ba8bd97fb96b1b2b3207bdcbaf7637f7e8581ed993c3469ed02271aad6534'}]}, 'timestamp': '2025-11-22 08:06:36.723451', '_unique_id': '1256b0c524214cc4bee03b86bea3a063'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.724 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceda72e7-663b-423e-afbf-e2323bb8da8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.724582', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2ad8637e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '303a78ac70e26ab53a8d958dada6aadbb9d7ca143d4ee79638887d570cfb379f'}]}, 'timestamp': '2025-11-22 08:06:36.724840', '_unique_id': 'a58dffb547084320b04cba3c8dd05a6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.725 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6e900f6-b9af-4ebb-918f-044f561f3f24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.725936', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2ad89786-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '8916688857721a54e62168e171ecd8cea081d2b1f3573fcdc92465c68e6952a3'}]}, 'timestamp': '2025-11-22 08:06:36.726165', '_unique_id': '85802dcfc75a49f0ac706cf8d876310f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.727 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.744 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9133dec3-c517-4f13-9079-5344c6300990', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'timestamp': '2025-11-22T08:06:36.727546', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2adb88f6-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.414642085, 'message_signature': '0a0c663c7348a02d3fbb6141e6c144caf0b0c7bb80ea8dbbaa537eab3523ade8'}]}, 'timestamp': '2025-11-22 08:06:36.745540', '_unique_id': '97e08256b9b341aaa7ea49be43f81c76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.747 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '527e22be-a340-4212-b9a2-b32dad1822b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.747361', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2adbe080-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '03d0ce0460216d5e651cf07eabe4b4e918456ec198f449bfd05604cf5dc51e15'}]}, 'timestamp': '2025-11-22 08:06:36.747735', '_unique_id': 'f6c680e78911478e8c1f41738eb727ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.748 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55584418-d560-40f3-807e-d83603dc5e7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.748955', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2adc1c26-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '9dd5ac8e7c358a9522468648c2891f7888d78ded39b87b077558898d2e4e5dbc'}]}, 'timestamp': '2025-11-22 08:06:36.749291', '_unique_id': '47233583687b40ab8df1f7c57e128c05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.751 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.751 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '617dd1fa-7dee-4ba2-888e-d5ba3ed17d92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.751032', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2adc6dd4-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': 'b4202218643b80644cc28ef5dff0b2812ed9e5ef86d400b7cdded509393fd108'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.751032', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2adc7a40-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '906653adf2b0015bbfe99ae207fa0ddaeb74ed0c7f47cf84b19f356146c42d08'}]}, 'timestamp': '2025-11-22 08:06:36.751680', '_unique_id': '8baec947ac354d36a8c38a90a8dab4d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.753 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5956b359-1626-4071-b148-d0fac9ec2ac8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.753323', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2adcc7e8-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '6d6ca74883e004e783dc41b971fd00776ab9ae1bc36a534e17b931b11de9271e'}]}, 'timestamp': '2025-11-22 08:06:36.753667', '_unique_id': 'f53558eea7994ceda50f7915f4d77e84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.755 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.755 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc6cb042-cdc0-4498-9294-e59dde6fe88b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.755190', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2add10c2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '14e865dd03414fdbf9eac5d21274d6447a202dcd588c11a2ded0cc4c962393f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.755190', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2add19d2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '2418f69a6a7d640beb675f808389d803bd5e709c49cbbb0c7aceda432205c1a0'}]}, 'timestamp': '2025-11-22 08:06:36.755767', '_unique_id': '4d3324c9b3bd4072a30f5fed0336ee38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.757 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09a38c5a-0721-48e3-806c-96f779a3d032', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.757371', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2add6536-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '3b2580dac7ee46d7b8ec96e4498692d5770cab195f64fd9faec209b219a2f827'}]}, 'timestamp': '2025-11-22 08:06:36.757696', '_unique_id': '2e9adad59ddb4cdca97186a4a322d89c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.759 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.759 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79f3fb75-ffa5-4d19-9444-136a36165f02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.759247', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2addaf78-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': '44c27f027cf5b650aca38940b36fdcc06252b17e3e98622d6a6f1922a041ec25'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.759247', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2addbc02-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.340267332, 'message_signature': '945189d5c8a2a7782b7dde697586dd7148eb9e5026529800205353d1e76c624e'}]}, 'timestamp': '2025-11-22 08:06:36.759894', '_unique_id': '117c402045ab435fb5860df847f0833d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.761 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.761 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79c24d33-1f0a-4cc5-a4d4-bd3048cac0b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.761640', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ade0c48-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '05e15c171eeff62717c57f624aa900fa9551c30af7dfcc47b3cb118fa5d9ff45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.761640', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ade1788-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '206924cf54bf0a8729ed080f2b7450127eb7b4136d67ad9562cf2e51b32e3909'}]}, 'timestamp': '2025-11-22 08:06:36.762232', '_unique_id': '067048ca16c04d04a83d7ded7aea9eb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.762 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.764 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bfb9360-49e2-40ad-81b6-b3c00abc29a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.763984', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2ade67f6-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '05b7040c615d5cf0ffce168f11e5e30350206dfb2d3971f098b4c5cc421719de'}]}, 'timestamp': '2025-11-22 08:06:36.764341', '_unique_id': 'ed13f35504bf47458589b4ca9b196417'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.765 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>]
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>]
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-1944607136>]
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.767 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.767 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 7f1b779f-4565-4529-a2c8-dcb0414326f6: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.767 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '938700e9-b182-453e-a4f8-12e5efcf62bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': 'instance-0000006c-7f1b779f-4565-4529-a2c8-dcb0414326f6-tap0eb38acd-9b', 'timestamp': '2025-11-22T08:06:36.767431', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'tap0eb38acd-9b', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:8b:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0eb38acd-9b'}, 'message_id': '2adeef1e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.357920442, 'message_signature': '18c5e713d5f7d386f6f0bc188e1321d3d188760181d2cab1fbdbf22cde0d78df'}]}, 'timestamp': '2025-11-22 08:06:36.767816', '_unique_id': '68dfc7c0eec44f608fa003d88795ba9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.769 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.769 12 DEBUG ceilometer.compute.pollsters [-] 7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0765e733-df3d-4b6c-adbb-64337052c89f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-vda', 'timestamp': '2025-11-22T08:06:36.769240', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2adf3438-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': 'c499bbb653bda5a958a2b91709dbbe82b969564adbb6c65bbdc40ae89b1e0b55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2c1b21c06c9b48d39e736b195bd12c8c', 'user_name': None, 'project_id': '8f7086819eb340f28dd7087159d82fa3', 'project_name': None, 'resource_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6-sda', 'timestamp': '2025-11-22T08:06:36.769240', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-1944607136', 'name': 'instance-0000006c', 'instance_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'instance_type': 'm1.nano', 'host': '216bdafe739e0a3e3d1e5de248403d6b49b2591f524e852cc6e1e322', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2adf3cf8-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5539.363090714, 'message_signature': '6436e58f0dfa445f64105bfbb4cdfc1752e70a8fb3b81490d66a6af0159b7e28'}]}, 'timestamp': '2025-11-22 08:06:36.769708', '_unique_id': 'a1919feae2a64682abeea67cb8c396b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:06:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:06:36 np0005531887 nova_compute[186849]: 2025-11-22 08:06:36.826 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.147 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:37.340 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:37.341 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:06:37.342 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.544 186853 DEBUG nova.compute.manager [req-931e5392-a13f-436a-aef9-04275dc11a57 req-398767f2-8b96-4e79-9dc1-fb7635741bfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.545 186853 DEBUG oslo_concurrency.lockutils [req-931e5392-a13f-436a-aef9-04275dc11a57 req-398767f2-8b96-4e79-9dc1-fb7635741bfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.545 186853 DEBUG oslo_concurrency.lockutils [req-931e5392-a13f-436a-aef9-04275dc11a57 req-398767f2-8b96-4e79-9dc1-fb7635741bfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.545 186853 DEBUG oslo_concurrency.lockutils [req-931e5392-a13f-436a-aef9-04275dc11a57 req-398767f2-8b96-4e79-9dc1-fb7635741bfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.545 186853 DEBUG nova.compute.manager [req-931e5392-a13f-436a-aef9-04275dc11a57 req-398767f2-8b96-4e79-9dc1-fb7635741bfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Processing event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.546 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.549 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798797.5495424, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.550 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.552 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.557 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance spawned successfully.#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.557 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.577 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.583 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.586 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.586 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.586 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.587 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.587 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.587 186853 DEBUG nova.virt.libvirt.driver [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.610 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.649 186853 INFO nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Took 10.61 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.649 186853 DEBUG nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.728 186853 INFO nova.compute.manager [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Took 11.19 seconds to build instance.#033[00m
Nov 22 03:06:37 np0005531887 nova_compute[186849]: 2025-11-22 08:06:37.749 186853 DEBUG oslo_concurrency.lockutils [None req-11f1cc81-3b1f-4933-b948-d7af76efb7a8 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:37 np0005531887 podman[229693]: 2025-11-22 08:06:37.844150561 +0000 UTC m=+0.061237419 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.505 186853 INFO nova.compute.manager [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Rescuing#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.506 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.506 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquired lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.506 186853 DEBUG nova.network.neutron [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.702 186853 DEBUG nova.compute.manager [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.702 186853 DEBUG oslo_concurrency.lockutils [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.703 186853 DEBUG oslo_concurrency.lockutils [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.703 186853 DEBUG oslo_concurrency.lockutils [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.704 186853 DEBUG nova.compute.manager [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:06:39 np0005531887 nova_compute[186849]: 2025-11-22 08:06:39.704 186853 WARNING nova.compute.manager [req-e58ef8a7-7603-4c5e-8796-ab032fd5295d req-a1e3a8f1-a1fb-405d-aec1-3695ab1ada37 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:06:40 np0005531887 podman[229714]: 2025-11-22 08:06:40.83541941 +0000 UTC m=+0.052705373 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 03:06:41 np0005531887 nova_compute[186849]: 2025-11-22 08:06:41.205 186853 DEBUG nova.network.neutron [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updating instance_info_cache with network_info: [{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:06:41 np0005531887 nova_compute[186849]: 2025-11-22 08:06:41.232 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Releasing lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:06:41 np0005531887 nova_compute[186849]: 2025-11-22 08:06:41.575 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:06:41 np0005531887 nova_compute[186849]: 2025-11-22 08:06:41.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:42 np0005531887 nova_compute[186849]: 2025-11-22 08:06:42.150 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:46 np0005531887 nova_compute[186849]: 2025-11-22 08:06:46.828 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:46 np0005531887 podman[229734]: 2025-11-22 08:06:46.869443906 +0000 UTC m=+0.087343525 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:06:47 np0005531887 nova_compute[186849]: 2025-11-22 08:06:47.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:51 np0005531887 nova_compute[186849]: 2025-11-22 08:06:51.618 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:06:51 np0005531887 nova_compute[186849]: 2025-11-22 08:06:51.830 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:52 np0005531887 nova_compute[186849]: 2025-11-22 08:06:52.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:52 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:52Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:8b:e0 10.100.0.8
Nov 22 03:06:52 np0005531887 ovn_controller[95130]: 2025-11-22T08:06:52Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:8b:e0 10.100.0.8
Nov 22 03:06:52 np0005531887 nova_compute[186849]: 2025-11-22 08:06:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:52 np0005531887 podman[229776]: 2025-11-22 08:06:52.840010017 +0000 UTC m=+0.056662844 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350)
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.862 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.922 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.923 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:06:54 np0005531887 nova_compute[186849]: 2025-11-22 08:06:54.980 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.142 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.144 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5549MB free_disk=73.31678009033203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.144 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.145 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.212 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 7f1b779f-4565-4529-a2c8-dcb0414326f6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.213 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.213 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.258 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.271 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.298 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:06:55 np0005531887 nova_compute[186849]: 2025-11-22 08:06:55.299 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:06:55 np0005531887 podman[229804]: 2025-11-22 08:06:55.847052805 +0000 UTC m=+0.062785538 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:06:55 np0005531887 podman[229805]: 2025-11-22 08:06:55.867873045 +0000 UTC m=+0.077998206 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:06:56 np0005531887 nova_compute[186849]: 2025-11-22 08:06:56.833 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:57 np0005531887 nova_compute[186849]: 2025-11-22 08:06:57.157 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:06:57 np0005531887 nova_compute[186849]: 2025-11-22 08:06:57.299 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:58 np0005531887 nova_compute[186849]: 2025-11-22 08:06:58.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:58 np0005531887 nova_compute[186849]: 2025-11-22 08:06:58.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:06:58 np0005531887 nova_compute[186849]: 2025-11-22 08:06:58.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.787 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:07:00 np0005531887 nova_compute[186849]: 2025-11-22 08:07:00.787 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:01 np0005531887 nova_compute[186849]: 2025-11-22 08:07:01.835 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:01 np0005531887 podman[229851]: 2025-11-22 08:07:01.874268418 +0000 UTC m=+0.079411322 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:07:02 np0005531887 nova_compute[186849]: 2025-11-22 08:07:02.159 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:02 np0005531887 nova_compute[186849]: 2025-11-22 08:07:02.669 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:07:03 np0005531887 nova_compute[186849]: 2025-11-22 08:07:03.060 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updating instance_info_cache with network_info: [{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:03 np0005531887 nova_compute[186849]: 2025-11-22 08:07:03.077 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-7f1b779f-4565-4529-a2c8-dcb0414326f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:03 np0005531887 nova_compute[186849]: 2025-11-22 08:07:03.077 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:07:03 np0005531887 nova_compute[186849]: 2025-11-22 08:07:03.078 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:04 np0005531887 nova_compute[186849]: 2025-11-22 08:07:04.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:05 np0005531887 kernel: tap0eb38acd-9b (unregistering): left promiscuous mode
Nov 22 03:07:05 np0005531887 NetworkManager[55210]: <info>  [1763798825.1909] device (tap0eb38acd-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:05Z|00347|binding|INFO|Releasing lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 from this chassis (sb_readonly=0)
Nov 22 03:07:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:05Z|00348|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 down in Southbound
Nov 22 03:07:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:05Z|00349|binding|INFO|Removing iface tap0eb38acd-9b ovn-installed in OVS
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.206 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:8b:e0 10.100.0.8'], port_security=['fa:16:3e:6e:8b:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0eb38acd-9bcc-4884-9e50-c571e7d6a405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.207 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb38acd-9bcc-4884-9e50-c571e7d6a405 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 unbound from our chassis#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.208 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9714091-78f6-46c8-b55b-4a278bd99b49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.209 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[741a76d2-d3ab-4680-aaee-0c50a9b40d95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.210 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace which is not needed anymore#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Nov 22 03:07:05 np0005531887 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000006c.scope: Consumed 15.639s CPU time.
Nov 22 03:07:05 np0005531887 systemd-machined[153180]: Machine qemu-42-instance-0000006c terminated.
Nov 22 03:07:05 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [NOTICE]   (229681) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:05 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [NOTICE]   (229681) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:05 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [WARNING]  (229681) : Exiting Master process...
Nov 22 03:07:05 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [ALERT]    (229681) : Current worker (229683) exited with code 143 (Terminated)
Nov 22 03:07:05 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[229671]: [WARNING]  (229681) : All workers exited. Exiting... (0)
Nov 22 03:07:05 np0005531887 systemd[1]: libpod-f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580.scope: Deactivated successfully.
Nov 22 03:07:05 np0005531887 podman[229899]: 2025-11-22 08:07:05.363651038 +0000 UTC m=+0.064805001 container died f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:07:05 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:05 np0005531887 systemd[1]: var-lib-containers-storage-overlay-a4ae92f610f64a0680a90855d762b6100c65d4a96e003c2ef984b661eb3b718c-merged.mount: Deactivated successfully.
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 podman[229899]: 2025-11-22 08:07:05.443294295 +0000 UTC m=+0.144448258 container cleanup f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 03:07:05 np0005531887 systemd[1]: libpod-conmon-f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580.scope: Deactivated successfully.
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.512 186853 DEBUG nova.compute.manager [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.514 186853 DEBUG oslo_concurrency.lockutils [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.514 186853 DEBUG oslo_concurrency.lockutils [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.514 186853 DEBUG oslo_concurrency.lockutils [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.514 186853 DEBUG nova.compute.manager [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.515 186853 WARNING nova.compute.manager [req-716125ec-054e-4e26-a163-9cc964bf5261 req-2bac5894-9cdf-4979-a4e8-9e261da7bee8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:07:05 np0005531887 podman[229943]: 2025-11-22 08:07:05.531353427 +0000 UTC m=+0.061380723 container remove f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.536 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[acd534a7-c404-4b73-9243-b5bdc5ba4301]: (4, ('Sat Nov 22 08:07:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580)\nf90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580\nSat Nov 22 08:07:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (f90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580)\nf90fe7d32f25b43153f0587a78fe622484c13ba687c0ea037cf00207ee1e8580\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.538 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3e496a90-c394-4a7d-855c-e6fd4598becf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.540 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 kernel: tapf9714091-70: left promiscuous mode
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.559 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.562 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c9aa361b-b62c-49b8-b13e-f75be0d7cf6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.583 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[698119a6-84d0-4ad1-afca-9fd6d83a85ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.584 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd0763a-52da-4fef-b940-2ac703cc8dc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.598 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[83f509ee-9671-4d15-8f95-7c8c63fa3b36]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 553571, 'reachable_time': 38543, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229965, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.601 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:05.601 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[45fbde74-a83a-454f-b3b5-efed3f5e4469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:05 np0005531887 systemd[1]: run-netns-ovnmeta\x2df9714091\x2d78f6\x2d46c8\x2db55b\x2d4a278bd99b49.mount: Deactivated successfully.
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.684 186853 INFO nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.691 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance destroyed successfully.#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.692 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.705 186853 INFO nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Attempting rescue#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.710 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.715 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.716 186853 INFO nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Creating image(s)#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.716 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.717 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.717 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.718 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.738 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.738 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.753 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.821 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.822 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.921 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.rescue" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.922 186853 DEBUG oslo_concurrency.lockutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.923 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.941 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.942 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start _get_guest_xml network_info=[{"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "vif_mac": "fa:16:3e:6e:8b:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.942 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'resources' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.959 186853 WARNING nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.967 186853 DEBUG nova.virt.libvirt.host [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.968 186853 DEBUG nova.virt.libvirt.host [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.972 186853 DEBUG nova.virt.libvirt.host [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.972 186853 DEBUG nova.virt.libvirt.host [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.973 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.974 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.974 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.974 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.975 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.975 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.975 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.975 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.975 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.976 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.976 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.976 186853 DEBUG nova.virt.hardware [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.976 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.992 186853 DEBUG nova.virt.libvirt.vif [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1944607136',display_name='tempest-ServerRescueNegativeTestJSON-server-1944607136',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1944607136',id=108,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-caomrmyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner_user_name='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:37Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=7f1b779f-4565-4529-a2c8-dcb0414326f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "vif_mac": "fa:16:3e:6e:8b:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.993 186853 DEBUG nova.network.os_vif_util [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "vif_mac": "fa:16:3e:6e:8b:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.993 186853 DEBUG nova.network.os_vif_util [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:05 np0005531887 nova_compute[186849]: 2025-11-22 08:07:05.994 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.025 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <uuid>7f1b779f-4565-4529-a2c8-dcb0414326f6</uuid>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <name>instance-0000006c</name>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-1944607136</nova:name>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:07:05</nova:creationTime>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:user uuid="2c1b21c06c9b48d39e736b195bd12c8c">tempest-ServerRescueNegativeTestJSON-1724156244-project-member</nova:user>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:project uuid="8f7086819eb340f28dd7087159d82fa3">tempest-ServerRescueNegativeTestJSON-1724156244</nova:project>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        <nova:port uuid="0eb38acd-9bcc-4884-9e50-c571e7d6a405">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="serial">7f1b779f-4565-4529-a2c8-dcb0414326f6</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="uuid">7f1b779f-4565-4529-a2c8-dcb0414326f6</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.rescue"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <target dev="vdb" bus="virtio"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config.rescue"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:6e:8b:e0"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <target dev="tap0eb38acd-9b"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/console.log" append="off"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:07:06 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:07:06 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:07:06 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:07:06 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.033 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance destroyed successfully.#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.110 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.111 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.111 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.111 186853 DEBUG nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] No VIF found with MAC fa:16:3e:6e:8b:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.112 186853 INFO nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Using config drive#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.129 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.156 186853 DEBUG nova.objects.instance [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'keypairs' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:06 np0005531887 nova_compute[186849]: 2025-11-22 08:07:06.837 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.051 186853 INFO nova.virt.libvirt.driver [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Creating config drive at /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config.rescue#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.061 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm2lh_c2w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.161 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.192 186853 DEBUG oslo_concurrency.processutils [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm2lh_c2w" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.2696] manager: (tap0eb38acd-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Nov 22 03:07:07 np0005531887 kernel: tap0eb38acd-9b: entered promiscuous mode
Nov 22 03:07:07 np0005531887 systemd-udevd[229880]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:07Z|00350|binding|INFO|Claiming lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 for this chassis.
Nov 22 03:07:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:07Z|00351|binding|INFO|0eb38acd-9bcc-4884-9e50-c571e7d6a405: Claiming fa:16:3e:6e:8b:e0 10.100.0.8
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.270 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.279 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:8b:e0 10.100.0.8'], port_security=['fa:16:3e:6e:8b:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0eb38acd-9bcc-4884-9e50-c571e7d6a405) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.281 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb38acd-9bcc-4884-9e50-c571e7d6a405 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 bound to our chassis#033[00m
Nov 22 03:07:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:07Z|00352|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 ovn-installed in OVS
Nov 22 03:07:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:07Z|00353|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 up in Southbound
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.282 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9714091-78f6-46c8-b55b-4a278bd99b49#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.2853] device (tap0eb38acd-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.286 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.2886] device (tap0eb38acd-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.295 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9732cb65-e158-4a65-8459-afd8db690d45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.296 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9714091-71 in ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.297 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9714091-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.297 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f1b901-94d4-469b-858b-78c715e249bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.298 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[475fb46b-4d28-475c-9ce1-78c0eb14a099]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.309 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[13943302-a06a-401c-a839-e5716520b3f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 systemd-machined[153180]: New machine qemu-43-instance-0000006c.
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.324 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6422be4f-b0a9-4b98-93e3-f2422bb59ee1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 systemd[1]: Started Virtual Machine qemu-43-instance-0000006c.
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.354 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6e7231-724e-49f6-8def-5cc44bcd8e30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.360 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4854c8de-f5f1-45f1-a50a-c4a6bbaf2e58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.3618] manager: (tapf9714091-70): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.396 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a57c2812-bebb-46b5-9156-96c00273b717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.398 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[df7bea54-d89a-4018-89ef-42045fa9e6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.4221] device (tapf9714091-70): carrier: link connected
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.427 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5b51b936-5235-43a1-8788-4b1f859a8213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.444 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8899405d-afc4-491c-8c80-03a5ac479991]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557002, 'reachable_time': 16882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230024, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.459 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1b92e4ab-65bf-40ec-8799-fb2c460dbe7c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:5583'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557002, 'tstamp': 557002}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230025, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.474 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfe1a70-f100-4133-9959-42d11a7b312c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9714091-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:55:83'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557002, 'reachable_time': 16882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230026, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.501 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9f73a3-10e0-46c2-af49-47152d89f8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.561 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b41877d7-300d-4ebd-89a6-07d6008c7766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.582 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.583 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.583 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9714091-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.585 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 NetworkManager[55210]: <info>  [1763798827.5857] manager: (tapf9714091-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Nov 22 03:07:07 np0005531887 kernel: tapf9714091-70: entered promiscuous mode
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.587 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9714091-70, col_values=(('external_ids', {'iface-id': '298be65c-aa9e-4327-b67d-2a3d4f1acf68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.588 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:07Z|00354|binding|INFO|Releasing lport 298be65c-aa9e-4327-b67d-2a3d4f1acf68 from this chassis (sb_readonly=0)
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.600 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.600 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d968a25d-851a-4fa7-b002-6ea40be6ddf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.601 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/f9714091-78f6-46c8-b55b-4a278bd99b49.pid.haproxy
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID f9714091-78f6-46c8-b55b-4a278bd99b49
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:07.602 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'env', 'PROCESS_TAG=haproxy-f9714091-78f6-46c8-b55b-4a278bd99b49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9714091-78f6-46c8-b55b-4a278bd99b49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.607 186853 DEBUG nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.608 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.608 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.608 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.609 186853 DEBUG nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.609 186853 WARNING nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.609 186853 DEBUG nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.609 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.610 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.610 186853 DEBUG oslo_concurrency.lockutils [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.610 186853 DEBUG nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:07 np0005531887 nova_compute[186849]: 2025-11-22 08:07:07.610 186853 WARNING nova.compute.manager [req-ef16aca1-4e62-404d-9fb0-4beb43058f9e req-94a442ff-220c-48cd-bc35-1408174d8794 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state active and task_state rescuing.#033[00m
Nov 22 03:07:08 np0005531887 podman[230057]: 2025-11-22 08:07:08.027586363 +0000 UTC m=+0.098123239 container create 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:07:08 np0005531887 podman[230057]: 2025-11-22 08:07:07.955232761 +0000 UTC m=+0.025769657 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.056 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 7f1b779f-4565-4529-a2c8-dcb0414326f6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.057 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798828.0549421, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.057 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.072 186853 DEBUG nova.compute.manager [None req-e6daf378-8021-4223-a902-678a95f0814c 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.081 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.089 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:08 np0005531887 systemd[1]: Started libpod-conmon-59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a.scope.
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.118 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.119 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798828.0557234, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.119 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Started (Lifecycle Event)#033[00m
Nov 22 03:07:08 np0005531887 podman[230077]: 2025-11-22 08:07:08.121061963 +0000 UTC m=+0.062252526 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:07:08 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:07:08 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c51804d8bab16ab4758260f701cf8074e8c8718207caa01f2b6193c402515bcd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.137 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:08 np0005531887 nova_compute[186849]: 2025-11-22 08:07:08.142 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:08 np0005531887 podman[230057]: 2025-11-22 08:07:08.144114759 +0000 UTC m=+0.214651655 container init 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:07:08 np0005531887 podman[230057]: 2025-11-22 08:07:08.149853115 +0000 UTC m=+0.220389991 container start 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:07:08 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [NOTICE]   (230101) : New worker (230103) forked
Nov 22 03:07:08 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [NOTICE]   (230101) : Loading success.
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.705 186853 DEBUG nova.compute.manager [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.706 186853 DEBUG oslo_concurrency.lockutils [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.706 186853 DEBUG oslo_concurrency.lockutils [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.707 186853 DEBUG oslo_concurrency.lockutils [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.707 186853 DEBUG nova.compute.manager [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:09 np0005531887 nova_compute[186849]: 2025-11-22 08:07:09.707 186853 WARNING nova.compute.manager [req-194b9211-2559-493e-9bb4-53ee6d227115 req-79e1a188-0c36-4ca6-9c6b-68d3e62b2fd5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state rescued and task_state None.#033[00m
Nov 22 03:07:11 np0005531887 podman[230112]: 2025-11-22 08:07:11.841181303 +0000 UTC m=+0.061220379 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 22 03:07:11 np0005531887 nova_compute[186849]: 2025-11-22 08:07:11.842 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:12 np0005531887 nova_compute[186849]: 2025-11-22 08:07:12.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:13.391 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:13 np0005531887 nova_compute[186849]: 2025-11-22 08:07:13.392 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:13 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:13.393 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.517 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.518 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.518 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.518 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.518 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.525 186853 INFO nova.compute.manager [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Terminating instance#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.536 186853 DEBUG nova.compute.manager [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:07:15 np0005531887 kernel: tap0eb38acd-9b (unregistering): left promiscuous mode
Nov 22 03:07:15 np0005531887 NetworkManager[55210]: <info>  [1763798835.5584] device (tap0eb38acd-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:07:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:15Z|00355|binding|INFO|Releasing lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 from this chassis (sb_readonly=0)
Nov 22 03:07:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:15Z|00356|binding|INFO|Setting lport 0eb38acd-9bcc-4884-9e50-c571e7d6a405 down in Southbound
Nov 22 03:07:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:15Z|00357|binding|INFO|Removing iface tap0eb38acd-9b ovn-installed in OVS
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.569 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:15.577 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:8b:e0 10.100.0.8'], port_security=['fa:16:3e:6e:8b:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7f1b779f-4565-4529-a2c8-dcb0414326f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9714091-78f6-46c8-b55b-4a278bd99b49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f7086819eb340f28dd7087159d82fa3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '65f3b143-522b-4e83-8261-f97700b0bd79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a437e229-533d-4315-8ee6-05d493bb5ad7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0eb38acd-9bcc-4884-9e50-c571e7d6a405) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:15.578 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb38acd-9bcc-4884-9e50-c571e7d6a405 in datapath f9714091-78f6-46c8-b55b-4a278bd99b49 unbound from our chassis#033[00m
Nov 22 03:07:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:15.580 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9714091-78f6-46c8-b55b-4a278bd99b49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:07:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:15.581 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[48b3b454-237c-4aa0-bc88-8093581c3de9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:15.582 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 namespace which is not needed anymore#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.586 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Nov 22 03:07:15 np0005531887 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000006c.scope: Consumed 8.353s CPU time.
Nov 22 03:07:15 np0005531887 systemd-machined[153180]: Machine qemu-43-instance-0000006c terminated.
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [NOTICE]   (230101) : haproxy version is 2.8.14-c23fe91
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [NOTICE]   (230101) : path to executable is /usr/sbin/haproxy
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [WARNING]  (230101) : Exiting Master process...
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [WARNING]  (230101) : Exiting Master process...
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [ALERT]    (230101) : Current worker (230103) exited with code 143 (Terminated)
Nov 22 03:07:15 np0005531887 neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49[230095]: [WARNING]  (230101) : All workers exited. Exiting... (0)
Nov 22 03:07:15 np0005531887 systemd[1]: libpod-59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a.scope: Deactivated successfully.
Nov 22 03:07:15 np0005531887 conmon[230095]: conmon 59a6aff2dc236fd79209 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a.scope/container/memory.events
Nov 22 03:07:15 np0005531887 podman[230158]: 2025-11-22 08:07:15.731366814 +0000 UTC m=+0.064725108 container died 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 03:07:15 np0005531887 kernel: tap0eb38acd-9b: entered promiscuous mode
Nov 22 03:07:15 np0005531887 NetworkManager[55210]: <info>  [1763798835.7594] manager: (tap0eb38acd-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Nov 22 03:07:15 np0005531887 kernel: tap0eb38acd-9b (unregistering): left promiscuous mode
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.809 186853 DEBUG nova.compute.manager [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.809 186853 DEBUG oslo_concurrency.lockutils [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.809 186853 DEBUG oslo_concurrency.lockutils [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.810 186853 DEBUG oslo_concurrency.lockutils [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.810 186853 DEBUG nova.compute.manager [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.810 186853 DEBUG nova.compute.manager [req-85fa5dc2-cb08-4173-950f-14800dfe7261 req-3113c592-af93-48fe-bf3f-d78589308b90 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-unplugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.843 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Instance destroyed successfully.#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.844 186853 DEBUG nova.objects.instance [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lazy-loading 'resources' on Instance uuid 7f1b779f-4565-4529-a2c8-dcb0414326f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:15 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a-userdata-shm.mount: Deactivated successfully.
Nov 22 03:07:15 np0005531887 systemd[1]: var-lib-containers-storage-overlay-c51804d8bab16ab4758260f701cf8074e8c8718207caa01f2b6193c402515bcd-merged.mount: Deactivated successfully.
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.878 186853 DEBUG nova.virt.libvirt.vif [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1944607136',display_name='tempest-ServerRescueNegativeTestJSON-server-1944607136',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-1944607136',id=108,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f7086819eb340f28dd7087159d82fa3',ramdisk_id='',reservation_id='r-caomrmyb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1724156244',owner_user_name='tempest-ServerRescueNegativeTestJSON-1724156244-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:08Z,user_data=None,user_id='2c1b21c06c9b48d39e736b195bd12c8c',uuid=7f1b779f-4565-4529-a2c8-dcb0414326f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.879 186853 DEBUG nova.network.os_vif_util [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converting VIF {"id": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "address": "fa:16:3e:6e:8b:e0", "network": {"id": "f9714091-78f6-46c8-b55b-4a278bd99b49", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1040984303-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f7086819eb340f28dd7087159d82fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0eb38acd-9b", "ovs_interfaceid": "0eb38acd-9bcc-4884-9e50-c571e7d6a405", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.879 186853 DEBUG nova.network.os_vif_util [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.880 186853 DEBUG os_vif [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.882 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.882 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0eb38acd-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.887 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.889 186853 INFO os_vif [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:8b:e0,bridge_name='br-int',has_traffic_filtering=True,id=0eb38acd-9bcc-4884-9e50-c571e7d6a405,network=Network(f9714091-78f6-46c8-b55b-4a278bd99b49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0eb38acd-9b')#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.890 186853 INFO nova.virt.libvirt.driver [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Deleting instance files /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6_del#033[00m
Nov 22 03:07:15 np0005531887 nova_compute[186849]: 2025-11-22 08:07:15.892 186853 INFO nova.virt.libvirt.driver [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Deletion of /var/lib/nova/instances/7f1b779f-4565-4529-a2c8-dcb0414326f6_del complete#033[00m
Nov 22 03:07:15 np0005531887 podman[230158]: 2025-11-22 08:07:15.927577689 +0000 UTC m=+0.260935983 container cleanup 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:07:15 np0005531887 systemd[1]: libpod-conmon-59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a.scope: Deactivated successfully.
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.040 186853 INFO nova.compute.manager [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.041 186853 DEBUG oslo.service.loopingcall [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.041 186853 DEBUG nova.compute.manager [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.041 186853 DEBUG nova.network.neutron [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:07:16 np0005531887 podman[230209]: 2025-11-22 08:07:16.330391584 +0000 UTC m=+0.380172720 container remove 59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.336 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[27bee009-809e-44c1-a113-e760aca38465]: (4, ('Sat Nov 22 08:07:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a)\n59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a\nSat Nov 22 08:07:15 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 (59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a)\n59a6aff2dc236fd79209a677f97ce49c25752068e0fc895c0f01794a7001614a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.338 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[db8d422b-c49a-4121-aa87-87ebbedc18c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.339 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9714091-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.341 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:16 np0005531887 kernel: tapf9714091-70: left promiscuous mode
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.354 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.358 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d0a5b8-c6de-454f-8974-5e2470db499d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.375 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6702be73-03eb-4e3b-8e04-0f5b918afa41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.377 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[93edecf5-7081-4b67-8993-62a425237fd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.393 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[14729877-bd48-478c-befe-3a6a20a966d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556995, 'reachable_time': 27945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230225, 'error': None, 'target': 'ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 systemd[1]: run-netns-ovnmeta\x2df9714091\x2d78f6\x2d46c8\x2db55b\x2d4a278bd99b49.mount: Deactivated successfully.
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.397 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9714091-78f6-46c8-b55b-4a278bd99b49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:07:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:16.397 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[b10c4c51-bc0c-41cd-a2bd-ab0163a5213c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:16 np0005531887 nova_compute[186849]: 2025-11-22 08:07:16.841 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.531 186853 DEBUG nova.network.neutron [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.550 186853 INFO nova.compute.manager [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Took 1.51 seconds to deallocate network for instance.#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.609 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.610 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.660 186853 DEBUG nova.compute.manager [req-4187dfb1-d948-4621-b6fc-54ca8477e391 req-49a3de22-a980-4587-aa8e-2372cc86fe24 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-deleted-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.671 186853 DEBUG nova.compute.provider_tree [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.685 186853 DEBUG nova.scheduler.client.report [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.703 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.740 186853 INFO nova.scheduler.client.report [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Deleted allocations for instance 7f1b779f-4565-4529-a2c8-dcb0414326f6#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.803 186853 DEBUG oslo_concurrency.lockutils [None req-8f7b17b3-069b-41cc-87a1-1e5c639e13ad 2c1b21c06c9b48d39e736b195bd12c8c 8f7086819eb340f28dd7087159d82fa3 - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:17 np0005531887 podman[230226]: 2025-11-22 08:07:17.84302897 +0000 UTC m=+0.059420923 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.973 186853 DEBUG nova.compute.manager [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.973 186853 DEBUG oslo_concurrency.lockutils [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.973 186853 DEBUG oslo_concurrency.lockutils [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.974 186853 DEBUG oslo_concurrency.lockutils [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f1b779f-4565-4529-a2c8-dcb0414326f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.974 186853 DEBUG nova.compute.manager [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] No waiting events found dispatching network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:17 np0005531887 nova_compute[186849]: 2025-11-22 08:07:17.974 186853 WARNING nova.compute.manager [req-57d8b051-4e48-42cb-a88e-f7be85b34c8d req-47c7215f-0501-459b-ae1c-9847ea4cf4c0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Received unexpected event network-vif-plugged-0eb38acd-9bcc-4884-9e50-c571e7d6a405 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:07:20 np0005531887 nova_compute[186849]: 2025-11-22 08:07:20.885 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:21.395 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:21 np0005531887 nova_compute[186849]: 2025-11-22 08:07:21.842 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:23 np0005531887 podman[230249]: 2025-11-22 08:07:23.845124053 +0000 UTC m=+0.052538699 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6)
Nov 22 03:07:24 np0005531887 nova_compute[186849]: 2025-11-22 08:07:24.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:25 np0005531887 nova_compute[186849]: 2025-11-22 08:07:25.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:25 np0005531887 nova_compute[186849]: 2025-11-22 08:07:25.949 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:25 np0005531887 nova_compute[186849]: 2025-11-22 08:07:25.950 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:25 np0005531887 nova_compute[186849]: 2025-11-22 08:07:25.965 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.050 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.051 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.057 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.058 186853 INFO nova.compute.claims [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.195 186853 DEBUG nova.compute.provider_tree [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.216 186853 DEBUG nova.scheduler.client.report [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.254 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.255 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.312 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.312 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.339 186853 INFO nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.362 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.490 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.492 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.492 186853 INFO nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Creating image(s)#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.493 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.493 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.494 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.507 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.572 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.573 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.574 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.586 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.652 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.653 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.710 186853 DEBUG nova.policy [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:07:26 np0005531887 nova_compute[186849]: 2025-11-22 08:07:26.843 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:26 np0005531887 podman[230279]: 2025-11-22 08:07:26.875960638 +0000 UTC m=+0.091420079 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 03:07:26 np0005531887 podman[230280]: 2025-11-22 08:07:26.884796253 +0000 UTC m=+0.089294895 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.025 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk 1073741824" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.026 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.026 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.083 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.084 186853 DEBUG nova.virt.disk.api [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.084 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.140 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.142 186853 DEBUG nova.virt.disk.api [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.143 186853 DEBUG nova.objects.instance [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 834afb36-7b49-4b59-9887-fe8b10d2d934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.160 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.160 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Ensure instance console log exists: /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.161 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.161 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:27 np0005531887 nova_compute[186849]: 2025-11-22 08:07:27.161 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:28 np0005531887 nova_compute[186849]: 2025-11-22 08:07:28.040 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Successfully created port: 43f2bb94-fd9e-4783-b426-c1651ae59f07 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.129 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Successfully updated port: 43f2bb94-fd9e-4783-b426-c1651ae59f07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.209 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.209 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.209 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.300 186853 DEBUG nova.compute.manager [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.301 186853 DEBUG nova.compute.manager [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing instance network info cache due to event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.301 186853 DEBUG oslo_concurrency.lockutils [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:29 np0005531887 nova_compute[186849]: 2025-11-22 08:07:29.471 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:07:30 np0005531887 nova_compute[186849]: 2025-11-22 08:07:30.842 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798835.8407123, 7f1b779f-4565-4529-a2c8-dcb0414326f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:30 np0005531887 nova_compute[186849]: 2025-11-22 08:07:30.843 186853 INFO nova.compute.manager [-] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:07:30 np0005531887 nova_compute[186849]: 2025-11-22 08:07:30.875 186853 DEBUG nova.compute.manager [None req-643dfeba-920c-4825-88e6-02b7a1b3a46d - - - - - -] [instance: 7f1b779f-4565-4529-a2c8-dcb0414326f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:30 np0005531887 nova_compute[186849]: 2025-11-22 08:07:30.890 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:31 np0005531887 nova_compute[186849]: 2025-11-22 08:07:31.846 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.098 186853 DEBUG nova.network.neutron [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.123 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.124 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Instance network_info: |[{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.124 186853 DEBUG oslo_concurrency.lockutils [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.125 186853 DEBUG nova.network.neutron [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.129 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Start _get_guest_xml network_info=[{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.135 186853 WARNING nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.146 186853 DEBUG nova.virt.libvirt.host [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.147 186853 DEBUG nova.virt.libvirt.host [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.153 186853 DEBUG nova.virt.libvirt.host [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.154 186853 DEBUG nova.virt.libvirt.host [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.156 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.156 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.157 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.157 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.157 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.157 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.158 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.158 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.158 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.159 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.159 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.159 186853 DEBUG nova.virt.hardware [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.164 186853 DEBUG nova.virt.libvirt.vif [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-377491549',display_name='tempest-TestGettingAddress-server-377491549',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-377491549',id=111,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6h3o0l87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:26Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=834afb36-7b49-4b59-9887-fe8b10d2d934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.165 186853 DEBUG nova.network.os_vif_util [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.166 186853 DEBUG nova.network.os_vif_util [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.167 186853 DEBUG nova.objects.instance [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 834afb36-7b49-4b59-9887-fe8b10d2d934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.181 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <uuid>834afb36-7b49-4b59-9887-fe8b10d2d934</uuid>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <name>instance-0000006f</name>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-377491549</nova:name>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:07:32</nova:creationTime>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        <nova:port uuid="43f2bb94-fd9e-4783-b426-c1651ae59f07">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe23:995d" ipVersion="6"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="serial">834afb36-7b49-4b59-9887-fe8b10d2d934</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="uuid">834afb36-7b49-4b59-9887-fe8b10d2d934</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.config"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:23:99:5d"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <target dev="tap43f2bb94-fd"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/console.log" append="off"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:07:32 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:07:32 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:07:32 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:07:32 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.183 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Preparing to wait for external event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.184 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.184 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.184 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.185 186853 DEBUG nova.virt.libvirt.vif [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-377491549',display_name='tempest-TestGettingAddress-server-377491549',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-377491549',id=111,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6h3o0l87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:26Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=834afb36-7b49-4b59-9887-fe8b10d2d934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.186 186853 DEBUG nova.network.os_vif_util [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.187 186853 DEBUG nova.network.os_vif_util [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.188 186853 DEBUG os_vif [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.188 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.189 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.189 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.194 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.194 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43f2bb94-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.195 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43f2bb94-fd, col_values=(('external_ids', {'iface-id': '43f2bb94-fd9e-4783-b426-c1651ae59f07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:99:5d', 'vm-uuid': '834afb36-7b49-4b59-9887-fe8b10d2d934'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:32 np0005531887 NetworkManager[55210]: <info>  [1763798852.1991] manager: (tap43f2bb94-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.204 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.204 186853 INFO os_vif [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd')#033[00m
Nov 22 03:07:32 np0005531887 podman[230329]: 2025-11-22 08:07:32.291075248 +0000 UTC m=+0.052377534 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.362 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.363 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.363 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:23:99:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.364 186853 INFO nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Using config drive#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.794 186853 INFO nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Creating config drive at /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.config#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.800 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf7wh14ah execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:32 np0005531887 nova_compute[186849]: 2025-11-22 08:07:32.930 186853 DEBUG oslo_concurrency.processutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf7wh14ah" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:33 np0005531887 kernel: tap43f2bb94-fd: entered promiscuous mode
Nov 22 03:07:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:33Z|00358|binding|INFO|Claiming lport 43f2bb94-fd9e-4783-b426-c1651ae59f07 for this chassis.
Nov 22 03:07:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:33Z|00359|binding|INFO|43f2bb94-fd9e-4783-b426-c1651ae59f07: Claiming fa:16:3e:23:99:5d 10.100.0.13 2001:db8::f816:3eff:fe23:995d
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.022 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.0244] manager: (tap43f2bb94-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.026 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.032 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.038 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.0517] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.0530] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 systemd-machined[153180]: New machine qemu-44-instance-0000006f.
Nov 22 03:07:33 np0005531887 systemd-udevd[230368]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.0766] device (tap43f2bb94-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.0778] device (tap43f2bb94-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 systemd[1]: Started Virtual Machine qemu-44-instance-0000006f.
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.158 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.166 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:99:5d 10.100.0.13 2001:db8::f816:3eff:fe23:995d'], port_security=['fa:16:3e:23:99:5d 10.100.0.13 2001:db8::f816:3eff:fe23:995d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe23:995d/64', 'neutron:device_id': '834afb36-7b49-4b59-9887-fe8b10d2d934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '573b06fa-1b11-4261-bfd0-ca50fa18731b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=43f2bb94-fd9e-4783-b426-c1651ae59f07) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.167 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 43f2bb94-fd9e-4783-b426-c1651ae59f07 in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e bound to our chassis#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.168 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90da6fca-65d1-4012-9602-d88842a0ad0e#033[00m
Nov 22 03:07:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:33Z|00360|binding|INFO|Setting lport 43f2bb94-fd9e-4783-b426-c1651ae59f07 ovn-installed in OVS
Nov 22 03:07:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:33Z|00361|binding|INFO|Setting lport 43f2bb94-fd9e-4783-b426-c1651ae59f07 up in Southbound
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.171 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.180 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a40f6b80-aed5-4cb5-9a73-690eef2a2310]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.181 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90da6fca-61 in ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.184 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90da6fca-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.184 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9389919f-38fe-432c-bfce-e703e1b99a38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.185 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3b076bc7-f201-4349-aa0c-9895b5c4e230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.197 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[440a8b26-9466-4969-84da-ee4c37d6b3c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.212 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[68cd4f53-b15a-4972-bc7b-c3a520352a03]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.239 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c24f5896-4f35-49f2-8461-f3c0fe2c5bd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.2457] manager: (tap90da6fca-60): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.244 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[453b0903-a81b-42bf-b382-c6d0a60d2a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.277 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7f60ae3a-07be-44fb-a837-e66b8a833324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.281 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[98062e3f-c6b7-4248-8dc7-785811684889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.3145] device (tap90da6fca-60): carrier: link connected
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.323 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[89edb601-5fbc-4d6e-bf5e-0f6c6362f544]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.346 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1873ca-2b61-4828-9b74-1b9f10f18112]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90da6fca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:28:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559592, 'reachable_time': 27271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230401, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.364 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d988e29e-7664-4ee7-81e7-aa4ea1835161]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:28f1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559592, 'tstamp': 559592}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230402, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.382 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2204e9d2-2937-4157-93e0-0d5699d0fb85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90da6fca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:28:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 111], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559592, 'reachable_time': 27271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230403, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.420 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[12eadaad-c229-4a5a-9839-a5ad07e58a36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.484 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2376ee00-ff54-4cf6-8dcb-f439109b2dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.487 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90da6fca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.487 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.488 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90da6fca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:33 np0005531887 NetworkManager[55210]: <info>  [1763798853.4905] manager: (tap90da6fca-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:33 np0005531887 kernel: tap90da6fca-60: entered promiscuous mode
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.493 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90da6fca-60, col_values=(('external_ids', {'iface-id': '0abd56a4-3e9e-4d28-8383-eadcda41744d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:07:33 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:33Z|00362|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.518 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.519 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdee330-e454-4ad6-bbb0-257c34034024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.520 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-90da6fca-65d1-4012-9602-d88842a0ad0e
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 90da6fca-65d1-4012-9602-d88842a0ad0e
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:07:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:33.521 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'env', 'PROCESS_TAG=haproxy-90da6fca-65d1-4012-9602-d88842a0ad0e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90da6fca-65d1-4012-9602-d88842a0ad0e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.766 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798853.7653565, 834afb36-7b49-4b59-9887-fe8b10d2d934 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.766 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] VM Started (Lifecycle Event)#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.882 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.888 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798853.7657607, 834afb36-7b49-4b59-9887-fe8b10d2d934 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.889 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.909 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.913 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:33 np0005531887 nova_compute[186849]: 2025-11-22 08:07:33.953 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:33 np0005531887 podman[230442]: 2025-11-22 08:07:33.886834041 +0000 UTC m=+0.026882035 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:07:34 np0005531887 podman[230442]: 2025-11-22 08:07:34.090949897 +0000 UTC m=+0.230997871 container create 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 03:07:34 np0005531887 systemd[1]: Started libpod-conmon-5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604.scope.
Nov 22 03:07:34 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:07:34 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8f263bc3667a75ead0aed3f1d2f7638fd33945fd0a3222d9f408383171f72a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.388 186853 DEBUG nova.compute.manager [req-c35633ba-4515-4e52-81b6-61a88123cbdc req-bea7bcc9-d334-49bd-a4b6-fe3d4c827c65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.389 186853 DEBUG oslo_concurrency.lockutils [req-c35633ba-4515-4e52-81b6-61a88123cbdc req-bea7bcc9-d334-49bd-a4b6-fe3d4c827c65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.390 186853 DEBUG oslo_concurrency.lockutils [req-c35633ba-4515-4e52-81b6-61a88123cbdc req-bea7bcc9-d334-49bd-a4b6-fe3d4c827c65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.390 186853 DEBUG oslo_concurrency.lockutils [req-c35633ba-4515-4e52-81b6-61a88123cbdc req-bea7bcc9-d334-49bd-a4b6-fe3d4c827c65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.390 186853 DEBUG nova.compute.manager [req-c35633ba-4515-4e52-81b6-61a88123cbdc req-bea7bcc9-d334-49bd-a4b6-fe3d4c827c65 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Processing event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.391 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.395 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798854.395134, 834afb36-7b49-4b59-9887-fe8b10d2d934 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.396 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.397 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.400 186853 INFO nova.virt.libvirt.driver [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Instance spawned successfully.#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.401 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.421 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.422 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.422 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.423 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.423 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.424 186853 DEBUG nova.virt.libvirt.driver [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.429 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.432 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:07:34 np0005531887 podman[230442]: 2025-11-22 08:07:34.450054479 +0000 UTC m=+0.590102463 container init 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:07:34 np0005531887 podman[230442]: 2025-11-22 08:07:34.455092437 +0000 UTC m=+0.595140401 container start 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.465 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:07:34 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [NOTICE]   (230462) : New worker (230464) forked
Nov 22 03:07:34 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [NOTICE]   (230462) : Loading success.
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.530 186853 INFO nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Took 8.04 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.531 186853 DEBUG nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.617 186853 INFO nova.compute.manager [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Took 8.60 seconds to build instance.#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.638 186853 DEBUG oslo_concurrency.lockutils [None req-e8d2250d-9077-42d1-b53f-021a91016ef5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.661 186853 DEBUG nova.network.neutron [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updated VIF entry in instance network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.662 186853 DEBUG nova.network.neutron [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:34 np0005531887 nova_compute[186849]: 2025-11-22 08:07:34.676 186853 DEBUG oslo_concurrency.lockutils [req-1efef4b0-2a30-4467-8cc3-5429b38f70e5 req-febceab6-9f7b-44c7-a554-2be0fbcd1eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:35 np0005531887 nova_compute[186849]: 2025-11-22 08:07:35.981 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.550 186853 DEBUG nova.compute.manager [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.551 186853 DEBUG oslo_concurrency.lockutils [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.551 186853 DEBUG oslo_concurrency.lockutils [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.551 186853 DEBUG oslo_concurrency.lockutils [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.551 186853 DEBUG nova.compute.manager [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] No waiting events found dispatching network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.552 186853 WARNING nova.compute.manager [req-d7ff92ef-39fd-4db3-9ee5-eea803231ee1 req-1f47e076-92ad-42ba-8135-2f6b877a6863 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received unexpected event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:07:36 np0005531887 nova_compute[186849]: 2025-11-22 08:07:36.848 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:37 np0005531887 nova_compute[186849]: 2025-11-22 08:07:37.212 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:37 np0005531887 nova_compute[186849]: 2025-11-22 08:07:37.214 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:37.341 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:37.342 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:07:37.342 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:38 np0005531887 podman[230475]: 2025-11-22 08:07:38.852212373 +0000 UTC m=+0.066659517 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:07:39 np0005531887 nova_compute[186849]: 2025-11-22 08:07:39.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:41 np0005531887 nova_compute[186849]: 2025-11-22 08:07:41.850 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:42 np0005531887 nova_compute[186849]: 2025-11-22 08:07:42.214 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:42 np0005531887 podman[230495]: 2025-11-22 08:07:42.849677635 +0000 UTC m=+0.065308033 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:07:45 np0005531887 nova_compute[186849]: 2025-11-22 08:07:45.617 186853 DEBUG nova.compute.manager [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:07:45 np0005531887 nova_compute[186849]: 2025-11-22 08:07:45.618 186853 DEBUG nova.compute.manager [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing instance network info cache due to event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:07:45 np0005531887 nova_compute[186849]: 2025-11-22 08:07:45.618 186853 DEBUG oslo_concurrency.lockutils [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:07:45 np0005531887 nova_compute[186849]: 2025-11-22 08:07:45.618 186853 DEBUG oslo_concurrency.lockutils [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:07:45 np0005531887 nova_compute[186849]: 2025-11-22 08:07:45.618 186853 DEBUG nova.network.neutron [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:07:46 np0005531887 nova_compute[186849]: 2025-11-22 08:07:46.853 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531887 nova_compute[186849]: 2025-11-22 08:07:47.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531887 nova_compute[186849]: 2025-11-22 08:07:47.216 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:47 np0005531887 nova_compute[186849]: 2025-11-22 08:07:47.737 186853 DEBUG nova.network.neutron [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updated VIF entry in instance network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:07:47 np0005531887 nova_compute[186849]: 2025-11-22 08:07:47.737 186853 DEBUG nova.network.neutron [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:07:47 np0005531887 nova_compute[186849]: 2025-11-22 08:07:47.777 186853 DEBUG oslo_concurrency.lockutils [req-7a5176d8-9726-4582-bf3b-fba9ed83de0a req-d69646af-0484-446b-aa6d-c019e1e4f569 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:07:48 np0005531887 podman[230527]: 2025-11-22 08:07:48.869615014 +0000 UTC m=+0.082003389 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:07:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:51Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:99:5d 10.100.0.13
Nov 22 03:07:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:07:51Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:99:5d 10.100.0.13
Nov 22 03:07:51 np0005531887 nova_compute[186849]: 2025-11-22 08:07:51.855 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:52 np0005531887 nova_compute[186849]: 2025-11-22 08:07:52.218 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:53 np0005531887 nova_compute[186849]: 2025-11-22 08:07:53.183 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:54 np0005531887 nova_compute[186849]: 2025-11-22 08:07:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:54 np0005531887 podman[230555]: 2025-11-22 08:07:54.834648203 +0000 UTC m=+0.058203562 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm)
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.788 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.862 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.925 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.926 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:07:55 np0005531887 nova_compute[186849]: 2025-11-22 08:07:55.982 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.149 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.151 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5566MB free_disk=73.31682205200195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.151 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.152 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.232 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 834afb36-7b49-4b59-9887-fe8b10d2d934 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.232 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.232 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.313 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.325 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.348 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.349 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:56 np0005531887 nova_compute[186849]: 2025-11-22 08:07:56.858 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:57 np0005531887 nova_compute[186849]: 2025-11-22 08:07:57.221 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:07:57 np0005531887 podman[230585]: 2025-11-22 08:07:57.837395223 +0000 UTC m=+0.056702595 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:07:57 np0005531887 podman[230586]: 2025-11-22 08:07:57.870426334 +0000 UTC m=+0.085913448 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:07:59 np0005531887 nova_compute[186849]: 2025-11-22 08:07:59.351 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:59 np0005531887 nova_compute[186849]: 2025-11-22 08:07:59.351 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:07:59 np0005531887 nova_compute[186849]: 2025-11-22 08:07:59.351 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:07:59 np0005531887 nova_compute[186849]: 2025-11-22 08:07:59.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:00 np0005531887 ovn_controller[95130]: 2025-11-22T08:08:00Z|00363|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.753 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.994 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:08:00 np0005531887 nova_compute[186849]: 2025-11-22 08:08:00.994 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 834afb36-7b49-4b59-9887-fe8b10d2d934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:01 np0005531887 nova_compute[186849]: 2025-11-22 08:08:01.860 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:02 np0005531887 nova_compute[186849]: 2025-11-22 08:08:02.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:02 np0005531887 podman[230628]: 2025-11-22 08:08:02.875523925 +0000 UTC m=+0.084547073 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.531 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.556 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.557 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.557 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.559 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.560 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.560 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.560 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.560 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.570 186853 INFO nova.compute.manager [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Terminating instance#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.578 186853 DEBUG nova.compute.manager [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:08:03 np0005531887 kernel: tap43f2bb94-fd (unregistering): left promiscuous mode
Nov 22 03:08:03 np0005531887 NetworkManager[55210]: <info>  [1763798883.6064] device (tap43f2bb94-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.620 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:08:03Z|00364|binding|INFO|Releasing lport 43f2bb94-fd9e-4783-b426-c1651ae59f07 from this chassis (sb_readonly=0)
Nov 22 03:08:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:08:03Z|00365|binding|INFO|Setting lport 43f2bb94-fd9e-4783-b426-c1651ae59f07 down in Southbound
Nov 22 03:08:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:08:03Z|00366|binding|INFO|Removing iface tap43f2bb94-fd ovn-installed in OVS
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.624 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:03.638 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:99:5d 10.100.0.13 2001:db8::f816:3eff:fe23:995d'], port_security=['fa:16:3e:23:99:5d 10.100.0.13 2001:db8::f816:3eff:fe23:995d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe23:995d/64', 'neutron:device_id': '834afb36-7b49-4b59-9887-fe8b10d2d934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '573b06fa-1b11-4261-bfd0-ca50fa18731b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=43f2bb94-fd9e-4783-b426-c1651ae59f07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:08:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:03.640 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 43f2bb94-fd9e-4783-b426-c1651ae59f07 in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e unbound from our chassis#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.642 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:03.642 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90da6fca-65d1-4012-9602-d88842a0ad0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:08:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:03.643 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[094048e7-39b9-4a56-9ba6-ddac4b9807b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:03.644 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e namespace which is not needed anymore#033[00m
Nov 22 03:08:03 np0005531887 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Nov 22 03:08:03 np0005531887 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000006f.scope: Consumed 15.997s CPU time.
Nov 22 03:08:03 np0005531887 systemd-machined[153180]: Machine qemu-44-instance-0000006f terminated.
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.737 186853 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.738 186853 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing instance network info cache due to event network-changed-43f2bb94-fd9e-4783-b426-c1651ae59f07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.738 186853 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.738 186853 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.739 186853 DEBUG nova.network.neutron [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Refreshing network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.801 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.806 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [NOTICE]   (230462) : haproxy version is 2.8.14-c23fe91
Nov 22 03:08:03 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [NOTICE]   (230462) : path to executable is /usr/sbin/haproxy
Nov 22 03:08:03 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [WARNING]  (230462) : Exiting Master process...
Nov 22 03:08:03 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [ALERT]    (230462) : Current worker (230464) exited with code 143 (Terminated)
Nov 22 03:08:03 np0005531887 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[230458]: [WARNING]  (230462) : All workers exited. Exiting... (0)
Nov 22 03:08:03 np0005531887 systemd[1]: libpod-5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604.scope: Deactivated successfully.
Nov 22 03:08:03 np0005531887 podman[230676]: 2025-11-22 08:08:03.835128543 +0000 UTC m=+0.082265264 container died 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.852 186853 INFO nova.virt.libvirt.driver [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Instance destroyed successfully.#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.852 186853 DEBUG nova.objects.instance [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 834afb36-7b49-4b59-9887-fe8b10d2d934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.869 186853 DEBUG nova.virt.libvirt.vif [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:07:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-377491549',display_name='tempest-TestGettingAddress-server-377491549',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-377491549',id=111,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6h3o0l87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:34Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=834afb36-7b49-4b59-9887-fe8b10d2d934,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.869 186853 DEBUG nova.network.os_vif_util [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.870 186853 DEBUG nova.network.os_vif_util [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.870 186853 DEBUG os_vif [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.872 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.872 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43f2bb94-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.874 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.875 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.878 186853 INFO os_vif [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:99:5d,bridge_name='br-int',has_traffic_filtering=True,id=43f2bb94-fd9e-4783-b426-c1651ae59f07,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43f2bb94-fd')#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.879 186853 INFO nova.virt.libvirt.driver [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Deleting instance files /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934_del#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.880 186853 INFO nova.virt.libvirt.driver [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Deletion of /var/lib/nova/instances/834afb36-7b49-4b59-9887-fe8b10d2d934_del complete#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.990 186853 INFO nova.compute.manager [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.991 186853 DEBUG oslo.service.loopingcall [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.991 186853 DEBUG nova.compute.manager [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:08:03 np0005531887 nova_compute[186849]: 2025-11-22 08:08:03.992 186853 DEBUG nova.network.neutron [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.014 186853 DEBUG nova.compute.manager [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-unplugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.014 186853 DEBUG oslo_concurrency.lockutils [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.015 186853 DEBUG oslo_concurrency.lockutils [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.015 186853 DEBUG oslo_concurrency.lockutils [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.015 186853 DEBUG nova.compute.manager [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] No waiting events found dispatching network-vif-unplugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.015 186853 DEBUG nova.compute.manager [req-3e909049-e754-4fec-9b96-d5b126014d3b req-29c2d1a7-4365-4d68-b419-a158a2e57069 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-unplugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:08:04 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604-userdata-shm.mount: Deactivated successfully.
Nov 22 03:08:04 np0005531887 systemd[1]: var-lib-containers-storage-overlay-1f8f263bc3667a75ead0aed3f1d2f7638fd33945fd0a3222d9f408383171f72a-merged.mount: Deactivated successfully.
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.734 186853 DEBUG nova.network.neutron [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.751 186853 INFO nova.compute.manager [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Took 0.76 seconds to deallocate network for instance.#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.818 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.818 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.893 186853 DEBUG nova.compute.provider_tree [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.915 186853 DEBUG nova.scheduler.client.report [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:04 np0005531887 nova_compute[186849]: 2025-11-22 08:08:04.945 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.000 186853 INFO nova.scheduler.client.report [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 834afb36-7b49-4b59-9887-fe8b10d2d934#033[00m
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.091 186853 DEBUG oslo_concurrency.lockutils [None req-a4b01242-9875-4dd6-aa8b-393f21e325da 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.135 186853 DEBUG nova.compute.manager [req-bce8602c-898b-4fcf-94e9-7b8c9d46fb27 req-ca88f04b-0d94-4294-aef0-da15bf68955b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-deleted-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:05 np0005531887 podman[230676]: 2025-11-22 08:08:05.176815589 +0000 UTC m=+1.423952310 container cleanup 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:08:05 np0005531887 systemd[1]: libpod-conmon-5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604.scope: Deactivated successfully.
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:05 np0005531887 podman[230723]: 2025-11-22 08:08:05.88278878 +0000 UTC m=+0.682649949 container remove 5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.890 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b870f18a-972e-4ec8-81c8-3f3800c08f7b]: (4, ('Sat Nov 22 08:08:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e (5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604)\n5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604\nSat Nov 22 08:08:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e (5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604)\n5883fc9c27ca8f3a15ae177ad13ffb4fcf88249cf739db33304c671db465e604\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.892 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c89289-4f77-44d1-836a-d12ad01548a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.894 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90da6fca-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.896 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:05 np0005531887 kernel: tap90da6fca-60: left promiscuous mode
Nov 22 03:08:05 np0005531887 nova_compute[186849]: 2025-11-22 08:08:05.908 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.912 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf52c7d-9b0f-4598-bec6-66f3bdf5e99f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.929 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5e228b89-8efa-4306-a313-e1517311e32a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.931 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0e7b71-f38d-4317-a765-b4e2ef315a74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.948 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e85076b3-866d-4d3e-b9bc-c8a05978b74f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559584, 'reachable_time': 24659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230739, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:05 np0005531887 systemd[1]: run-netns-ovnmeta\x2d90da6fca\x2d65d1\x2d4012\x2d9602\x2dd88842a0ad0e.mount: Deactivated successfully.
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.952 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:08:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:05.952 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[65897327-2f00-440c-a42c-439be74835ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.136 186853 DEBUG nova.network.neutron [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updated VIF entry in instance network info cache for port 43f2bb94-fd9e-4783-b426-c1651ae59f07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.137 186853 DEBUG nova.network.neutron [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Updating instance_info_cache with network_info: [{"id": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "address": "fa:16:3e:23:99:5d", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe23:995d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43f2bb94-fd", "ovs_interfaceid": "43f2bb94-fd9e-4783-b426-c1651ae59f07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.154 186853 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-834afb36-7b49-4b59-9887-fe8b10d2d934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.162 186853 DEBUG nova.compute.manager [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.162 186853 DEBUG oslo_concurrency.lockutils [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.163 186853 DEBUG oslo_concurrency.lockutils [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.163 186853 DEBUG oslo_concurrency.lockutils [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "834afb36-7b49-4b59-9887-fe8b10d2d934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.163 186853 DEBUG nova.compute.manager [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] No waiting events found dispatching network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.164 186853 WARNING nova.compute.manager [req-756923a7-d539-4f0b-9f4d-bff12a9151bf req-7e0f4880-1892-4dd8-a482-e3af43a6b189 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Received unexpected event network-vif-plugged-43f2bb94-fd9e-4783-b426-c1651ae59f07 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.697 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.876 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:06 np0005531887 nova_compute[186849]: 2025-11-22 08:08:06.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:08 np0005531887 nova_compute[186849]: 2025-11-22 08:08:08.877 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:09 np0005531887 nova_compute[186849]: 2025-11-22 08:08:09.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:09 np0005531887 podman[230740]: 2025-11-22 08:08:09.847719704 +0000 UTC m=+0.059332362 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 03:08:11 np0005531887 nova_compute[186849]: 2025-11-22 08:08:11.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:13 np0005531887 podman[230762]: 2025-11-22 08:08:13.840226739 +0000 UTC m=+0.064430282 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 03:08:13 np0005531887 nova_compute[186849]: 2025-11-22 08:08:13.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:14.365 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:08:14 np0005531887 nova_compute[186849]: 2025-11-22 08:08:14.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:14.366 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:08:16 np0005531887 nova_compute[186849]: 2025-11-22 08:08:16.883 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:18.368 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:08:18 np0005531887 nova_compute[186849]: 2025-11-22 08:08:18.851 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798883.8485918, 834afb36-7b49-4b59-9887-fe8b10d2d934 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:08:18 np0005531887 nova_compute[186849]: 2025-11-22 08:08:18.852 186853 INFO nova.compute.manager [-] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:08:18 np0005531887 nova_compute[186849]: 2025-11-22 08:08:18.876 186853 DEBUG nova.compute.manager [None req-6482d5e2-d60a-404f-89df-d74e7f0a89f2 - - - - - -] [instance: 834afb36-7b49-4b59-9887-fe8b10d2d934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:08:18 np0005531887 nova_compute[186849]: 2025-11-22 08:08:18.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:19 np0005531887 podman[230782]: 2025-11-22 08:08:19.8406456 +0000 UTC m=+0.057316390 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:08:21 np0005531887 nova_compute[186849]: 2025-11-22 08:08:21.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:23 np0005531887 nova_compute[186849]: 2025-11-22 08:08:23.887 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:25 np0005531887 podman[230810]: 2025-11-22 08:08:25.853726292 +0000 UTC m=+0.066009770 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:08:26 np0005531887 nova_compute[186849]: 2025-11-22 08:08:26.885 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:28 np0005531887 podman[230831]: 2025-11-22 08:08:28.850663512 +0000 UTC m=+0.068734727 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:08:28 np0005531887 podman[230832]: 2025-11-22 08:08:28.872164353 +0000 UTC m=+0.086674390 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 03:08:28 np0005531887 nova_compute[186849]: 2025-11-22 08:08:28.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:31 np0005531887 nova_compute[186849]: 2025-11-22 08:08:31.887 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:33 np0005531887 podman[230873]: 2025-11-22 08:08:33.834370143 +0000 UTC m=+0.054008573 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:08:33 np0005531887 nova_compute[186849]: 2025-11-22 08:08:33.892 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:08:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:08:36 np0005531887 nova_compute[186849]: 2025-11-22 08:08:36.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:37.342 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:37.343 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:08:37.343 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:38 np0005531887 nova_compute[186849]: 2025-11-22 08:08:38.894 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:40 np0005531887 podman[230897]: 2025-11-22 08:08:40.841808308 +0000 UTC m=+0.058389522 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:08:41 np0005531887 nova_compute[186849]: 2025-11-22 08:08:41.889 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:43 np0005531887 nova_compute[186849]: 2025-11-22 08:08:43.897 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:44 np0005531887 podman[230916]: 2025-11-22 08:08:44.861453906 +0000 UTC m=+0.084614390 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:08:46 np0005531887 ovn_controller[95130]: 2025-11-22T08:08:46Z|00367|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 03:08:46 np0005531887 nova_compute[186849]: 2025-11-22 08:08:46.891 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.636 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.636 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.655 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.770 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.771 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.786 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.787 186853 INFO nova.compute.claims [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.901 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.909 186853 DEBUG nova.compute.provider_tree [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.932 186853 DEBUG nova.scheduler.client.report [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.953 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:48 np0005531887 nova_compute[186849]: 2025-11-22 08:08:48.954 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.020 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.021 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.073 186853 INFO nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.181 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.364 186853 DEBUG nova.policy [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.459 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.461 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.461 186853 INFO nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Creating image(s)#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.462 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.462 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.463 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.475 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.538 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.539 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.540 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.555 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.613 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.614 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.753 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk 1073741824" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.755 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.755 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.827 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.828 186853 DEBUG nova.virt.disk.api [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.828 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.886 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.887 186853 DEBUG nova.virt.disk.api [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.888 186853 DEBUG nova.objects.instance [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid b50dd877-42b1-46b2-933e-ee9a660a56c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.904 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.905 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Ensure instance console log exists: /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.905 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.908 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:49 np0005531887 nova_compute[186849]: 2025-11-22 08:08:49.908 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:50 np0005531887 podman[230951]: 2025-11-22 08:08:50.853011919 +0000 UTC m=+0.058590297 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:08:51 np0005531887 nova_compute[186849]: 2025-11-22 08:08:51.339 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Successfully created port: 1b5f134e-5728-4e7f-ba86-8650cc0b721d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:08:51 np0005531887 nova_compute[186849]: 2025-11-22 08:08:51.891 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:53 np0005531887 nova_compute[186849]: 2025-11-22 08:08:53.741 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Successfully created port: 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:08:53 np0005531887 nova_compute[186849]: 2025-11-22 08:08:53.904 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:55 np0005531887 nova_compute[186849]: 2025-11-22 08:08:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.334 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Successfully updated port: 1b5f134e-5728-4e7f-ba86-8650cc0b721d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.551 186853 DEBUG nova.compute.manager [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.551 186853 DEBUG nova.compute.manager [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing instance network info cache due to event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.552 186853 DEBUG oslo_concurrency.lockutils [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.552 186853 DEBUG oslo_concurrency.lockutils [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.552 186853 DEBUG nova.network.neutron [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing network info cache for port 1b5f134e-5728-4e7f-ba86-8650cc0b721d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.795 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:08:56 np0005531887 podman[230975]: 2025-11-22 08:08:56.871457847 +0000 UTC m=+0.082954018 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.893 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.974 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.975 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5743MB free_disk=73.34526824951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.975 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:08:56 np0005531887 nova_compute[186849]: 2025-11-22 08:08:56.975 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.021 186853 DEBUG nova.network.neutron [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.150 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance b50dd877-42b1-46b2-933e-ee9a660a56c3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.151 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.151 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.298 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.314 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.397 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.398 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.577 186853 DEBUG nova.network.neutron [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:08:57 np0005531887 nova_compute[186849]: 2025-11-22 08:08:57.593 186853 DEBUG oslo_concurrency.lockutils [req-e32acc20-724c-49b2-bb72-fddf128fccbb req-f75104a3-43a3-445e-97bb-e0498e7798c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.034 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Successfully updated port: 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.056 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.056 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.056 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.257 186853 DEBUG nova.compute.manager [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-changed-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.258 186853 DEBUG nova.compute.manager [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing instance network info cache due to event network-changed-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.258 186853 DEBUG oslo_concurrency.lockutils [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.305 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.391 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:58 np0005531887 nova_compute[186849]: 2025-11-22 08:08:58.907 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:08:59 np0005531887 nova_compute[186849]: 2025-11-22 08:08:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:08:59 np0005531887 nova_compute[186849]: 2025-11-22 08:08:59.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:08:59 np0005531887 podman[230995]: 2025-11-22 08:08:59.835969137 +0000 UTC m=+0.054403454 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:08:59 np0005531887 podman[230996]: 2025-11-22 08:08:59.867416222 +0000 UTC m=+0.083951502 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:00 np0005531887 nova_compute[186849]: 2025-11-22 08:09:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:00 np0005531887 nova_compute[186849]: 2025-11-22 08:09:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:09:01 np0005531887 nova_compute[186849]: 2025-11-22 08:09:01.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:02 np0005531887 nova_compute[186849]: 2025-11-22 08:09:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.792 186853 DEBUG nova.network.neutron [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.822 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.822 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance network_info: |[{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.823 186853 DEBUG oslo_concurrency.lockutils [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.823 186853 DEBUG nova.network.neutron [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing network info cache for port 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.827 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Start _get_guest_xml network_info=[{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.832 186853 WARNING nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.837 186853 DEBUG nova.virt.libvirt.host [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.838 186853 DEBUG nova.virt.libvirt.host [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.842 186853 DEBUG nova.virt.libvirt.host [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.842 186853 DEBUG nova.virt.libvirt.host [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.843 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.844 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.844 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.844 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.845 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.845 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.845 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.845 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.845 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.846 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.846 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.846 186853 DEBUG nova.virt.hardware [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.849 186853 DEBUG nova.virt.libvirt.vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.850 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.850 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.851 186853 DEBUG nova.virt.libvirt.vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.851 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.852 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.852 186853 DEBUG nova.objects.instance [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid b50dd877-42b1-46b2-933e-ee9a660a56c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.866 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <uuid>b50dd877-42b1-46b2-933e-ee9a660a56c3</uuid>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <name>instance-00000074</name>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-1979784336</nova:name>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:09:03</nova:creationTime>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:port uuid="1b5f134e-5728-4e7f-ba86-8650cc0b721d">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        <nova:port uuid="31719a20-f6e8-45a0-9f9a-d1e76c49b1a9">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe37:ae80" ipVersion="6"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="serial">b50dd877-42b1-46b2-933e-ee9a660a56c3</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="uuid">b50dd877-42b1-46b2-933e-ee9a660a56c3</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.config"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:9f:bd:73"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <target dev="tap1b5f134e-57"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:37:ae:80"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <target dev="tap31719a20-f6"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/console.log" append="off"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:09:03 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:09:03 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:09:03 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:09:03 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.867 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Preparing to wait for external event network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.868 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.868 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.868 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.868 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Preparing to wait for external event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.868 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.869 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.869 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.869 186853 DEBUG nova.virt.libvirt.vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.870 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.870 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.871 186853 DEBUG os_vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.871 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.872 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.872 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.875 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.876 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b5f134e-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.876 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b5f134e-57, col_values=(('external_ids', {'iface-id': '1b5f134e-5728-4e7f-ba86-8650cc0b721d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:bd:73', 'vm-uuid': 'b50dd877-42b1-46b2-933e-ee9a660a56c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 NetworkManager[55210]: <info>  [1763798943.8786] manager: (tap1b5f134e-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.888 186853 INFO os_vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57')#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.889 186853 DEBUG nova.virt.libvirt.vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.889 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.890 186853 DEBUG nova.network.os_vif_util [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.890 186853 DEBUG os_vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.891 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.891 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.891 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.894 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.894 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31719a20-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.895 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31719a20-f6, col_values=(('external_ids', {'iface-id': '31719a20-f6e8-45a0-9f9a-d1e76c49b1a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:ae:80', 'vm-uuid': 'b50dd877-42b1-46b2-933e-ee9a660a56c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.896 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 NetworkManager[55210]: <info>  [1763798943.8978] manager: (tap31719a20-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.899 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.904 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.905 186853 INFO os_vif [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6')#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.969 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.970 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.970 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:9f:bd:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.970 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:37:ae:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:03 np0005531887 nova_compute[186849]: 2025-11-22 08:09:03.971 186853 INFO nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Using config drive#033[00m
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.555 186853 INFO nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Creating config drive at /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.config#033[00m
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.560 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd92b9yyg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.685 186853 DEBUG oslo_concurrency.processutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd92b9yyg" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:04 np0005531887 kernel: tap1b5f134e-57: entered promiscuous mode
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.7574] manager: (tap1b5f134e-57): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00368|binding|INFO|Claiming lport 1b5f134e-5728-4e7f-ba86-8650cc0b721d for this chassis.
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00369|binding|INFO|1b5f134e-5728-4e7f-ba86-8650cc0b721d: Claiming fa:16:3e:9f:bd:73 10.100.0.11
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.780 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:bd:73 10.100.0.11'], port_security=['fa:16:3e:9f:bd:73 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8e809e-e81c-4dfc-8977-f974433d5b3a, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=1b5f134e-5728-4e7f-ba86-8650cc0b721d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.781 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 1b5f134e-5728-4e7f-ba86-8650cc0b721d in datapath 8591a8a4-c35f-454b-ba4c-4ec37a8765b2 bound to our chassis#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.782 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8591a8a4-c35f-454b-ba4c-4ec37a8765b2#033[00m
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.7895] manager: (tap31719a20-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.795 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7465ed58-0262-4134-b81b-f5bf991eea76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.796 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8591a8a4-c1 in ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.801 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8591a8a4-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.801 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5d7d51d3-8353-41f7-8c56-392e74eafc44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.803 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4373d6-03fe-448d-8297-35411f1ec1be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 systemd-udevd[231076]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:04 np0005531887 systemd-udevd[231077]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.819 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[b93f412e-3ea0-4cb1-a6f1-1f834300eb39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.8310] device (tap1b5f134e-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:04 np0005531887 podman[231053]: 2025-11-22 08:09:04.831761205 +0000 UTC m=+0.081521222 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.8325] device (tap1b5f134e-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:04 np0005531887 systemd-machined[153180]: New machine qemu-45-instance-00000074.
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.843 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[01ee1e9b-b33d-4748-966d-394ef532693b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.8456] device (tap31719a20-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:04 np0005531887 kernel: tap31719a20-f6: entered promiscuous mode
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.8471] device (tap31719a20-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00370|binding|INFO|Claiming lport 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 for this chassis.
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00371|binding|INFO|31719a20-f6e8-45a0-9f9a-d1e76c49b1a9: Claiming fa:16:3e:37:ae:80 2001:db8::f816:3eff:fe37:ae80
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.849 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531887 systemd[1]: Started Virtual Machine qemu-45-instance-00000074.
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.859 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:ae:80 2001:db8::f816:3eff:fe37:ae80'], port_security=['fa:16:3e:37:ae:80 2001:db8::f816:3eff:fe37:ae80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe37:ae80/64', 'neutron:device_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4afbec-9e59-4ffa-9128-10dc4f025189, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00372|binding|INFO|Setting lport 1b5f134e-5728-4e7f-ba86-8650cc0b721d ovn-installed in OVS
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00373|binding|INFO|Setting lport 1b5f134e-5728-4e7f-ba86-8650cc0b721d up in Southbound
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.861 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00374|binding|INFO|Setting lport 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 ovn-installed in OVS
Nov 22 03:09:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:04Z|00375|binding|INFO|Setting lport 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 up in Southbound
Nov 22 03:09:04 np0005531887 nova_compute[186849]: 2025-11-22 08:09:04.872 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.879 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[fa6aef74-ecf9-4ea2-801e-93dc2f60efaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.885 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4894127e-10b2-46fb-a386-9407783e1125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.8869] manager: (tap8591a8a4-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.925 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ced1f00c-7fee-4a44-8a55-d8e1f34de74e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.928 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a8123d57-82c1-4049-bb3d-6d77ee2a07b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 NetworkManager[55210]: <info>  [1763798944.9583] device (tap8591a8a4-c0): carrier: link connected
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.962 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ec62cbb9-365d-4f10-9231-aa353681e4d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.980 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[44b08bb3-19e0-4482-b57b-4a22c6ead6ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8591a8a4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:5e:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568756, 'reachable_time': 19865, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231121, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:04.998 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[09fd838f-d3fd-4709-8251-db64f2a067c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:5ece'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568756, 'tstamp': 568756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231122, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.015 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fac8c5-0dd0-499f-97a3-79ca8995f847]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8591a8a4-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:5e:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568756, 'reachable_time': 19865, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231123, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.046 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2592e03e-ed4e-441e-8ba4-08651a7c386a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.109 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3e5947-77dd-48cd-bae7-2946ef6832f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.111 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8591a8a4-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.112 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.112 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8591a8a4-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.114 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:05 np0005531887 NetworkManager[55210]: <info>  [1763798945.1154] manager: (tap8591a8a4-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Nov 22 03:09:05 np0005531887 kernel: tap8591a8a4-c0: entered promiscuous mode
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.118 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.118 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8591a8a4-c0, col_values=(('external_ids', {'iface-id': 'ec231e2a-1042-4a3a-b541-060f5a121bb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.119 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:05Z|00376|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.132 186853 DEBUG nova.compute.manager [req-28f77aa9-572f-46b9-a024-7c2ff6904c1c req-a60cc322-418c-4b09-93bd-0622a420ded0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.133 186853 DEBUG oslo_concurrency.lockutils [req-28f77aa9-572f-46b9-a024-7c2ff6904c1c req-a60cc322-418c-4b09-93bd-0622a420ded0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.133 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.133 186853 DEBUG oslo_concurrency.lockutils [req-28f77aa9-572f-46b9-a024-7c2ff6904c1c req-a60cc322-418c-4b09-93bd-0622a420ded0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.134 186853 DEBUG oslo_concurrency.lockutils [req-28f77aa9-572f-46b9-a024-7c2ff6904c1c req-a60cc322-418c-4b09-93bd-0622a420ded0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.135 186853 DEBUG nova.compute.manager [req-28f77aa9-572f-46b9-a024-7c2ff6904c1c req-a60cc322-418c-4b09-93bd-0622a420ded0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Processing event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.135 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[22d0bb28-0260-49b9-ad9b-f7f27b2da81d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.136 186853 DEBUG nova.compute.manager [req-a48a9f7c-6aac-44fb-a4eb-31d03f6a63c1 req-9fc525c7-be09-487f-b875-d8fa7b66330b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.136 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-8591a8a4-c35f-454b-ba4c-4ec37a8765b2
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.pid.haproxy
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 8591a8a4-c35f-454b-ba4c-4ec37a8765b2
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.137 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'env', 'PROCESS_TAG=haproxy-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8591a8a4-c35f-454b-ba4c-4ec37a8765b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.136 186853 DEBUG oslo_concurrency.lockutils [req-a48a9f7c-6aac-44fb-a4eb-31d03f6a63c1 req-9fc525c7-be09-487f-b875-d8fa7b66330b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.138 186853 DEBUG oslo_concurrency.lockutils [req-a48a9f7c-6aac-44fb-a4eb-31d03f6a63c1 req-9fc525c7-be09-487f-b875-d8fa7b66330b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.138 186853 DEBUG oslo_concurrency.lockutils [req-a48a9f7c-6aac-44fb-a4eb-31d03f6a63c1 req-9fc525c7-be09-487f-b875-d8fa7b66330b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.139 186853 DEBUG nova.compute.manager [req-a48a9f7c-6aac-44fb-a4eb-31d03f6a63c1 req-9fc525c7-be09-487f-b875-d8fa7b66330b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Processing event network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.139 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:05 np0005531887 podman[231155]: 2025-11-22 08:09:05.537859591 +0000 UTC m=+0.057523531 container create 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:09:05 np0005531887 systemd[1]: Started libpod-conmon-162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263.scope.
Nov 22 03:09:05 np0005531887 podman[231155]: 2025-11-22 08:09:05.504010045 +0000 UTC m=+0.023674005 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:05 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:09:05 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bc215a5d5fd906043a313f9814d67bddf90cd688eb4d224be8ca48a729bd9ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:05 np0005531887 podman[231155]: 2025-11-22 08:09:05.627459202 +0000 UTC m=+0.147123162 container init 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:09:05 np0005531887 podman[231155]: 2025-11-22 08:09:05.635243204 +0000 UTC m=+0.154907144 container start 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:09:05 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [NOTICE]   (231175) : New worker (231183) forked
Nov 22 03:09:05 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [NOTICE]   (231175) : Loading success.
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.712 186853 DEBUG nova.network.neutron [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updated VIF entry in instance network info cache for port 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.713 186853 DEBUG nova.network.neutron [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.736 186853 DEBUG oslo_concurrency.lockutils [req-8155ade7-f4df-4a95-8ccd-4f395f30a96d req-a740be51-6370-489c-8508-0a39ce289beb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.737 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798945.7354023, b50dd877-42b1-46b2-933e-ee9a660a56c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.737 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.740 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.745 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.750 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 in datapath 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad unbound from our chassis#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.753 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.755 186853 INFO nova.virt.libvirt.driver [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance spawned successfully.#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.756 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.760 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.764 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.766 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ad22cf04-e234-4e05-aa18-18842a1a3ca2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.768 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a8e7fc1-61 in ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.770 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a8e7fc1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.770 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c680ee06-08ef-475c-a4a7-5a6937c1b2c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.772 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[98a31eec-7b5b-40c7-91be-c28877f6099b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.782 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.782 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.783 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.783 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.784 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.784 186853 DEBUG nova.virt.libvirt.driver [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.784 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0015215a-e7ef-43b7-982e-2372cb8c1522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.789 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.790 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798945.7355955, b50dd877-42b1-46b2-933e-ee9a660a56c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.791 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.800 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[72fc2742-85e2-4e50-890a-07264ae2aeb1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.810 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.815 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798945.7431614, b50dd877-42b1-46b2-933e-ee9a660a56c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.815 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.833 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[01493ad5-0034-4fda-be41-cf64c4870712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.835 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:05 np0005531887 systemd-udevd[231107]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:05 np0005531887 NetworkManager[55210]: <info>  [1763798945.8411] manager: (tap6a8e7fc1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.841 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.842 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ad08f57a-5776-44e7-82e8-a5c0eb915657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.848 186853 INFO nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Took 16.39 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.849 186853 DEBUG nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.870 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.884 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[743cdb7d-9635-4096-a6d7-0be00a2ce818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.891 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3b3011-426e-4c34-bae9-70f4dd7a380f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.913 186853 INFO nova.compute.manager [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Took 17.21 seconds to build instance.#033[00m
Nov 22 03:09:05 np0005531887 NetworkManager[55210]: <info>  [1763798945.9278] device (tap6a8e7fc1-60): carrier: link connected
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.933 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[cc37b50b-212e-4703-8df5-ed85950af694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 nova_compute[186849]: 2025-11-22 08:09:05.937 186853 DEBUG oslo_concurrency.lockutils [None req-833dbc5b-13b2-46c3-badd-d40f122d2006 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.950 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[72b63bb5-5173-4784-ad07-1618c40a0426]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8e7fc1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:e2:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568853, 'reachable_time': 33384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231203, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.967 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dc783673-eac3-4cfc-9bcd-fb45e84c77db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:e22a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568853, 'tstamp': 568853}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231204, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:05.985 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ce080386-7fd7-43af-ace9-0df52f1d28a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a8e7fc1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:e2:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568853, 'reachable_time': 33384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231205, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.021 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[01397bd4-49de-432d-9c32-ca133e22b055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.051 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[879242b7-29f3-48bc-905b-303070384f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.053 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8e7fc1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.055 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.056 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a8e7fc1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.058 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531887 NetworkManager[55210]: <info>  [1763798946.0589] manager: (tap6a8e7fc1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Nov 22 03:09:06 np0005531887 kernel: tap6a8e7fc1-60: entered promiscuous mode
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.061 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.062 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a8e7fc1-60, col_values=(('external_ids', {'iface-id': '288f6565-c1a7-412f-8593-8864123e2215'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.063 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:06Z|00377|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.064 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.064 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.065 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8c4592-9f26-4ba3-a532-2019648ad2bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.066 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.pid.haproxy
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:06.067 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'env', 'PROCESS_TAG=haproxy-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.075 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:06 np0005531887 podman[231235]: 2025-11-22 08:09:06.447625293 +0000 UTC m=+0.057143662 container create 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:06 np0005531887 systemd[1]: Started libpod-conmon-87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167.scope.
Nov 22 03:09:06 np0005531887 podman[231235]: 2025-11-22 08:09:06.416442163 +0000 UTC m=+0.025960562 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:06 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:09:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd11cfd91e13c5cf9abb1af6cfd96061eaa6fe133724788e30e7bd2c5bd3b98a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:06 np0005531887 podman[231235]: 2025-11-22 08:09:06.533854681 +0000 UTC m=+0.143373080 container init 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:09:06 np0005531887 podman[231235]: 2025-11-22 08:09:06.541433478 +0000 UTC m=+0.150951847 container start 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:09:06 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [NOTICE]   (231254) : New worker (231256) forked
Nov 22 03:09:06 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [NOTICE]   (231254) : Loading success.
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:09:06 np0005531887 nova_compute[186849]: 2025-11-22 08:09:06.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.224 186853 DEBUG nova.compute.manager [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.225 186853 DEBUG oslo_concurrency.lockutils [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.225 186853 DEBUG oslo_concurrency.lockutils [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.226 186853 DEBUG oslo_concurrency.lockutils [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.226 186853 DEBUG nova.compute.manager [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] No waiting events found dispatching network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.226 186853 WARNING nova.compute.manager [req-b5820b67-6543-409d-94ad-8cdc52cc73fc req-1536b187-001c-42ae-9e6a-8daba8450489 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received unexpected event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.476 186853 DEBUG nova.compute.manager [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.476 186853 DEBUG oslo_concurrency.lockutils [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.477 186853 DEBUG oslo_concurrency.lockutils [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.477 186853 DEBUG oslo_concurrency.lockutils [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.477 186853 DEBUG nova.compute.manager [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] No waiting events found dispatching network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.478 186853 WARNING nova.compute.manager [req-234a491a-a7a2-477f-bad6-b167aadbb0ee req-981cf664-3ccf-4438-ad55-e55af08d3606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received unexpected event network-vif-plugged-1b5f134e-5728-4e7f-ba86-8650cc0b721d for instance with vm_state active and task_state None.#033[00m
Nov 22 03:09:07 np0005531887 nova_compute[186849]: 2025-11-22 08:09:07.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:08 np0005531887 nova_compute[186849]: 2025-11-22 08:09:08.801 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531887 NetworkManager[55210]: <info>  [1763798948.8016] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Nov 22 03:09:08 np0005531887 NetworkManager[55210]: <info>  [1763798948.8027] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Nov 22 03:09:08 np0005531887 nova_compute[186849]: 2025-11-22 08:09:08.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531887 nova_compute[186849]: 2025-11-22 08:09:08.963 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:08Z|00378|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:08Z|00379|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:08 np0005531887 nova_compute[186849]: 2025-11-22 08:09:08.987 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.578 186853 DEBUG nova.compute.manager [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.578 186853 DEBUG nova.compute.manager [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing instance network info cache due to event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.579 186853 DEBUG oslo_concurrency.lockutils [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.579 186853 DEBUG oslo_concurrency.lockutils [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.579 186853 DEBUG nova.network.neutron [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing network info cache for port 1b5f134e-5728-4e7f-ba86-8650cc0b721d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:09 np0005531887 nova_compute[186849]: 2025-11-22 08:09:09.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:09:11 np0005531887 nova_compute[186849]: 2025-11-22 08:09:11.486 186853 DEBUG nova.network.neutron [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updated VIF entry in instance network info cache for port 1b5f134e-5728-4e7f-ba86-8650cc0b721d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:11 np0005531887 nova_compute[186849]: 2025-11-22 08:09:11.487 186853 DEBUG nova.network.neutron [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:11 np0005531887 nova_compute[186849]: 2025-11-22 08:09:11.507 186853 DEBUG oslo_concurrency.lockutils [req-a6a53d60-7dd5-4b3b-b5bb-b50dc8f3bf2b req-baadddab-cbd2-4462-8c31-d2ed890b3f9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:11 np0005531887 podman[231266]: 2025-11-22 08:09:11.873497465 +0000 UTC m=+0.088359261 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:09:11 np0005531887 nova_compute[186849]: 2025-11-22 08:09:11.901 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:12 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:12Z|00380|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:12 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:12Z|00381|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:12 np0005531887 nova_compute[186849]: 2025-11-22 08:09:12.975 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:13 np0005531887 nova_compute[186849]: 2025-11-22 08:09:13.900 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:15 np0005531887 nova_compute[186849]: 2025-11-22 08:09:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:15 np0005531887 podman[231285]: 2025-11-22 08:09:15.901149493 +0000 UTC m=+0.116102437 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 03:09:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:16.353 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:16 np0005531887 nova_compute[186849]: 2025-11-22 08:09:16.354 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:16.354 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:09:16 np0005531887 nova_compute[186849]: 2025-11-22 08:09:16.901 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:18 np0005531887 nova_compute[186849]: 2025-11-22 08:09:18.903 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:21Z|00382|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:21Z|00383|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:21 np0005531887 nova_compute[186849]: 2025-11-22 08:09:21.215 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:21Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:bd:73 10.100.0.11
Nov 22 03:09:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:21Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:bd:73 10.100.0.11
Nov 22 03:09:21 np0005531887 podman[231320]: 2025-11-22 08:09:21.849865239 +0000 UTC m=+0.064702168 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:09:21 np0005531887 nova_compute[186849]: 2025-11-22 08:09:21.903 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:23 np0005531887 nova_compute[186849]: 2025-11-22 08:09:23.655 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:23 np0005531887 nova_compute[186849]: 2025-11-22 08:09:23.906 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:25.356 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:26 np0005531887 nova_compute[186849]: 2025-11-22 08:09:26.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:26 np0005531887 nova_compute[186849]: 2025-11-22 08:09:26.906 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:27 np0005531887 podman[231344]: 2025-11-22 08:09:27.847156213 +0000 UTC m=+0.059376407 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, version=9.6, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:09:28 np0005531887 nova_compute[186849]: 2025-11-22 08:09:28.912 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:30 np0005531887 podman[231366]: 2025-11-22 08:09:30.841599942 +0000 UTC m=+0.065711863 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:09:30 np0005531887 podman[231367]: 2025-11-22 08:09:30.880710696 +0000 UTC m=+0.096428350 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:09:31 np0005531887 nova_compute[186849]: 2025-11-22 08:09:31.222 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:31 np0005531887 nova_compute[186849]: 2025-11-22 08:09:31.909 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:33 np0005531887 nova_compute[186849]: 2025-11-22 08:09:33.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:34 np0005531887 nova_compute[186849]: 2025-11-22 08:09:34.824 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:35 np0005531887 podman[231409]: 2025-11-22 08:09:35.820528205 +0000 UTC m=+0.044829607 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:09:36 np0005531887 nova_compute[186849]: 2025-11-22 08:09:36.912 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:37.344 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:37.344 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:37.345 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:38 np0005531887 nova_compute[186849]: 2025-11-22 08:09:38.085 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:38 np0005531887 nova_compute[186849]: 2025-11-22 08:09:38.919 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:40 np0005531887 nova_compute[186849]: 2025-11-22 08:09:40.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:41 np0005531887 nova_compute[186849]: 2025-11-22 08:09:41.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:42 np0005531887 podman[231435]: 2025-11-22 08:09:42.857450516 +0000 UTC m=+0.074854848 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:09:43 np0005531887 nova_compute[186849]: 2025-11-22 08:09:43.922 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:46Z|00384|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:09:46 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:46Z|00385|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:09:46 np0005531887 nova_compute[186849]: 2025-11-22 08:09:46.352 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:46 np0005531887 podman[231454]: 2025-11-22 08:09:46.834046214 +0000 UTC m=+0.055663465 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:46 np0005531887 nova_compute[186849]: 2025-11-22 08:09:46.918 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.653 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.653 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.675 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.770 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.771 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.779 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:47 np0005531887 nova_compute[186849]: 2025-11-22 08:09:47.780 186853 INFO nova.compute.claims [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.102 186853 DEBUG nova.compute.provider_tree [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.113 186853 DEBUG nova.scheduler.client.report [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.134 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.135 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.179 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.180 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.198 186853 INFO nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.214 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.343 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.344 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.344 186853 INFO nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Creating image(s)#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.345 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.345 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.346 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.358 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.420 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.421 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.421 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.433 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.498 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.499 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.621 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk 1073741824" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.622 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.622 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.694 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.695 186853 DEBUG nova.virt.disk.api [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Checking if we can resize image /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.696 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.754 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.756 186853 DEBUG nova.virt.disk.api [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Cannot resize image /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.756 186853 DEBUG nova.objects.instance [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b982188-a0e8-474c-a959-760a28dc3ffe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.767 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.767 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Ensure instance console log exists: /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.768 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.768 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.769 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:48 np0005531887 nova_compute[186849]: 2025-11-22 08:09:48.925 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:49 np0005531887 nova_compute[186849]: 2025-11-22 08:09:49.751 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Successfully created port: 726decc7-8256-48c9-992a-051f7215b6fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.190 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Successfully updated port: 726decc7-8256-48c9-992a-051f7215b6fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.216 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.217 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquired lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.217 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.283 186853 DEBUG nova.compute.manager [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-changed-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.283 186853 DEBUG nova.compute.manager [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Refreshing instance network info cache due to event network-changed-726decc7-8256-48c9-992a-051f7215b6fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.283 186853 DEBUG oslo_concurrency.lockutils [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.394 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:09:51 np0005531887 nova_compute[186849]: 2025-11-22 08:09:51.919 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.365 186853 DEBUG nova.network.neutron [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Updating instance_info_cache with network_info: [{"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.386 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Releasing lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.386 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Instance network_info: |[{"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.387 186853 DEBUG oslo_concurrency.lockutils [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.387 186853 DEBUG nova.network.neutron [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Refreshing network info cache for port 726decc7-8256-48c9-992a-051f7215b6fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.390 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Start _get_guest_xml network_info=[{"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.397 186853 WARNING nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.406 186853 DEBUG nova.virt.libvirt.host [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.407 186853 DEBUG nova.virt.libvirt.host [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.412 186853 DEBUG nova.virt.libvirt.host [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.413 186853 DEBUG nova.virt.libvirt.host [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.415 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.416 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.416 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.416 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.417 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.417 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.417 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.418 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.418 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.418 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.418 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.419 186853 DEBUG nova.virt.hardware [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.423 186853 DEBUG nova.virt.libvirt.vif [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1940159938',display_name='tempest-TestServerMultinode-server-1940159938',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1940159938',id=123,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-0dc019pn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-
1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:48Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=1b982188-a0e8-474c-a959-760a28dc3ffe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.424 186853 DEBUG nova.network.os_vif_util [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.424 186853 DEBUG nova.network.os_vif_util [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.425 186853 DEBUG nova.objects.instance [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b982188-a0e8-474c-a959-760a28dc3ffe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.439 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <uuid>1b982188-a0e8-474c-a959-760a28dc3ffe</uuid>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <name>instance-0000007b</name>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestServerMultinode-server-1940159938</nova:name>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:09:52</nova:creationTime>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:user uuid="1bc17d213e01420ebb2a0bf75f44e357">tempest-TestServerMultinode-1734646453-project-admin</nova:user>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:project uuid="b67388009f754931a62cbdd391fb4f53">tempest-TestServerMultinode-1734646453</nova:project>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        <nova:port uuid="726decc7-8256-48c9-992a-051f7215b6fa">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="serial">1b982188-a0e8-474c-a959-760a28dc3ffe</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="uuid">1b982188-a0e8-474c-a959-760a28dc3ffe</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.config"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:22:16:fc"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <target dev="tap726decc7-82"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/console.log" append="off"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:09:52 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:09:52 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:09:52 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:09:52 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.440 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Preparing to wait for external event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.441 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.441 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.441 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.442 186853 DEBUG nova.virt.libvirt.vif [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1940159938',display_name='tempest-TestServerMultinode-server-1940159938',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1940159938',id=123,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-0dc019pn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServer
Multinode-1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:48Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=1b982188-a0e8-474c-a959-760a28dc3ffe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.442 186853 DEBUG nova.network.os_vif_util [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.443 186853 DEBUG nova.network.os_vif_util [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.443 186853 DEBUG os_vif [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.444 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.445 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.445 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.450 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap726decc7-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.450 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap726decc7-82, col_values=(('external_ids', {'iface-id': '726decc7-8256-48c9-992a-051f7215b6fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:16:fc', 'vm-uuid': '1b982188-a0e8-474c-a959-760a28dc3ffe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.452 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531887 NetworkManager[55210]: <info>  [1763798992.4539] manager: (tap726decc7-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.462 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.463 186853 INFO os_vif [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82')#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.536 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.537 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.537 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No VIF found with MAC fa:16:3e:22:16:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.538 186853 INFO nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Using config drive#033[00m
Nov 22 03:09:52 np0005531887 podman[231493]: 2025-11-22 08:09:52.574227253 +0000 UTC m=+0.065202560 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.990 186853 INFO nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Creating config drive at /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.config#033[00m
Nov 22 03:09:52 np0005531887 nova_compute[186849]: 2025-11-22 08:09:52.995 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpugs_g54r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.121 186853 DEBUG oslo_concurrency.processutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpugs_g54r" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:53 np0005531887 kernel: tap726decc7-82: entered promiscuous mode
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.2019] manager: (tap726decc7-82): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Nov 22 03:09:53 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:53Z|00386|binding|INFO|Claiming lport 726decc7-8256-48c9-992a-051f7215b6fa for this chassis.
Nov 22 03:09:53 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:53Z|00387|binding|INFO|726decc7-8256-48c9-992a-051f7215b6fa: Claiming fa:16:3e:22:16:fc 10.100.0.4
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.203 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.222 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:16:fc 10.100.0.4'], port_security=['fa:16:3e:22:16:fc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1b982188-a0e8-474c-a959-760a28dc3ffe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=726decc7-8256-48c9-992a-051f7215b6fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.222 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:53Z|00388|binding|INFO|Setting lport 726decc7-8256-48c9-992a-051f7215b6fa ovn-installed in OVS
Nov 22 03:09:53 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:53Z|00389|binding|INFO|Setting lport 726decc7-8256-48c9-992a-051f7215b6fa up in Southbound
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.226 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.225 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 726decc7-8256-48c9-992a-051f7215b6fa in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 bound to our chassis#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.229 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 390460fe-fb7f-40ce-abb7-9e99dea93a54#033[00m
Nov 22 03:09:53 np0005531887 systemd-udevd[231530]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.250 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a2f0fe-dba5-44d5-8b16-e6a7e1618927]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.252 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap390460fe-f1 in ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.254 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap390460fe-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.254 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa5314d-e05c-4729-ae6d-c38f193de0c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.255 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6f313e06-6841-48c5-b212-99341f0fce67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 systemd-machined[153180]: New machine qemu-46-instance-0000007b.
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.2640] device (tap726decc7-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.2652] device (tap726decc7-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.268 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[6bbe3a21-c0fb-4921-96f5-ae8f2e64f84f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 systemd[1]: Started Virtual Machine qemu-46-instance-0000007b.
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.297 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[be295f20-a6fe-4231-b912-7ecd77030376]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.336 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[bc216b5f-983b-4acc-a808-efe03e458289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.341 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[71c2df7c-4b17-4c92-b35b-e4a9a12d3ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.3427] manager: (tap390460fe-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.382 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1f212e43-cf90-4922-bf2f-523428989fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.385 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6425a033-b1e1-4111-b14d-2a996fb332a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.4095] device (tap390460fe-f0): carrier: link connected
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.414 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e22dd22c-1cf3-49f2-bcf3-5ad0dc852d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.432 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8e80cff6-d6fb-4c2d-8f54-e5f9b76c48d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573601, 'reachable_time': 37750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231565, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.462 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a7578e-fbb5-441a-8de0-b62aa4119481]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:a50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573601, 'tstamp': 573601}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231566, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.480 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5817ab4d-bdad-4cdd-89e1-08cb0e6554fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573601, 'reachable_time': 37750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231567, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.510 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[50098a78-085a-4fd1-b2d0-f214bce17f5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.584 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd5f730-30b2-4574-a7c1-94a9decc4691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.586 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.587 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.587 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap390460fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.590 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 NetworkManager[55210]: <info>  [1763798993.5915] manager: (tap390460fe-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Nov 22 03:09:53 np0005531887 kernel: tap390460fe-f0: entered promiscuous mode
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.594 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap390460fe-f0, col_values=(('external_ids', {'iface-id': '71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.595 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 ovn_controller[95130]: 2025-11-22T08:09:53Z|00390|binding|INFO|Releasing lport 71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b from this chassis (sb_readonly=0)
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.597 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.598 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.599 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ff88a9da-2c81-4c04-8c99-0d624461db74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.600 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:09:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:09:53.600 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'env', 'PROCESS_TAG=haproxy-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/390460fe-fb7f-40ce-abb7-9e99dea93a54.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.609 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.742 186853 DEBUG nova.compute.manager [req-ba59d509-b902-47dc-8067-d2151b38d5c4 req-7af1799b-f738-49d4-a572-944c4254f480 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.742 186853 DEBUG oslo_concurrency.lockutils [req-ba59d509-b902-47dc-8067-d2151b38d5c4 req-7af1799b-f738-49d4-a572-944c4254f480 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.743 186853 DEBUG oslo_concurrency.lockutils [req-ba59d509-b902-47dc-8067-d2151b38d5c4 req-7af1799b-f738-49d4-a572-944c4254f480 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.743 186853 DEBUG oslo_concurrency.lockutils [req-ba59d509-b902-47dc-8067-d2151b38d5c4 req-7af1799b-f738-49d4-a572-944c4254f480 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.743 186853 DEBUG nova.compute.manager [req-ba59d509-b902-47dc-8067-d2151b38d5c4 req-7af1799b-f738-49d4-a572-944c4254f480 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Processing event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.836 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798993.8356216, 1b982188-a0e8-474c-a959-760a28dc3ffe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.837 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] VM Started (Lifecycle Event)#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.840 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.857 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.861 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.866 186853 INFO nova.virt.libvirt.driver [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Instance spawned successfully.#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.867 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.869 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.901 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.903 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798993.8373299, 1b982188-a0e8-474c-a959-760a28dc3ffe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.903 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.914 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.915 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.916 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.916 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.917 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.917 186853 DEBUG nova.virt.libvirt.driver [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.922 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.929 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763798993.844815, 1b982188-a0e8-474c-a959-760a28dc3ffe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.930 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.971 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:53 np0005531887 nova_compute[186849]: 2025-11-22 08:09:53.977 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.002 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.008 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.026 186853 INFO nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Took 5.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.026 186853 DEBUG nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:09:54 np0005531887 podman[231607]: 2025-11-22 08:09:54.012661852 +0000 UTC m=+0.034482503 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.115 186853 INFO nova.compute.manager [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Took 6.38 seconds to build instance.#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.161 186853 DEBUG oslo_concurrency.lockutils [None req-d7ac62cc-6b56-4d96-a31f-a846a9f7408e 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:54 np0005531887 podman[231607]: 2025-11-22 08:09:54.3246311 +0000 UTC m=+0.346451731 container create 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:09:54 np0005531887 systemd[1]: Started libpod-conmon-80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b.scope.
Nov 22 03:09:54 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:09:54 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5487dd6275a854b48add95401328a5e187d979244c230d5ab82d54aedbed0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:09:54 np0005531887 podman[231607]: 2025-11-22 08:09:54.557443756 +0000 UTC m=+0.579264387 container init 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:09:54 np0005531887 podman[231607]: 2025-11-22 08:09:54.564383567 +0000 UTC m=+0.586204198 container start 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:09:54 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [NOTICE]   (231626) : New worker (231628) forked
Nov 22 03:09:54 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [NOTICE]   (231626) : Loading success.
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.809 186853 DEBUG nova.network.neutron [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Updated VIF entry in instance network info cache for port 726decc7-8256-48c9-992a-051f7215b6fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.810 186853 DEBUG nova.network.neutron [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Updating instance_info_cache with network_info: [{"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:09:54 np0005531887 nova_compute[186849]: 2025-11-22 08:09:54.827 186853 DEBUG oslo_concurrency.lockutils [req-70ee8538-28df-4dc4-872a-df4b3daca0ce req-6f637c45-8891-46ca-b25c-71aa1430bcc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b982188-a0e8-474c-a959-760a28dc3ffe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.781 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.862 186853 DEBUG nova.compute.manager [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.862 186853 DEBUG oslo_concurrency.lockutils [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.862 186853 DEBUG oslo_concurrency.lockutils [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.863 186853 DEBUG oslo_concurrency.lockutils [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.863 186853 DEBUG nova.compute.manager [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] No waiting events found dispatching network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:09:55 np0005531887 nova_compute[186849]: 2025-11-22 08:09:55.863 186853 WARNING nova.compute.manager [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received unexpected event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa for instance with vm_state active and task_state None.#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.801 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.801 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.879 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.922 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.947 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:56 np0005531887 nova_compute[186849]: 2025-11-22 08:09:56.948 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.009 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.017 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.079 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.080 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.143 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.367 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.370 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5402MB free_disk=73.30164337158203GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.370 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.370 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.452 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.467 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance b50dd877-42b1-46b2-933e-ee9a660a56c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.468 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 1b982188-a0e8-474c-a959-760a28dc3ffe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.468 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.468 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.551 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.570 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.598 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.598 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:57 np0005531887 nova_compute[186849]: 2025-11-22 08:09:57.943 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.819 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.819 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.844 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:09:58 np0005531887 podman[231650]: 2025-11-22 08:09:58.852724988 +0000 UTC m=+0.074553281 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, version=9.6, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.967 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.968 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.978 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:09:58 np0005531887 nova_compute[186849]: 2025-11-22 08:09:58.978 186853 INFO nova.compute.claims [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.145 186853 DEBUG nova.compute.provider_tree [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.161 186853 DEBUG nova.scheduler.client.report [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.197 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.197 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.304 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.305 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.336 186853 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.380 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.484 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.486 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.486 186853 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Creating image(s)#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.487 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.487 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.487 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.502 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.524 186853 DEBUG nova.policy [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c867ad823e59410b995507d3e85b3465', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c564dfb60114407b72d22a9c49ed513', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.577 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.579 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.579 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.591 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.616 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.660 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.661 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.708 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.710 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.710 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.769 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.770 186853 DEBUG nova.virt.disk.api [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Checking if we can resize image /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.770 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.831 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.833 186853 DEBUG nova.virt.disk.api [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Cannot resize image /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.833 186853 DEBUG nova.objects.instance [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'migration_context' on Instance uuid 1eab226d-c316-4033-b802-511921219249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.851 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.852 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Ensure instance console log exists: /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.852 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.853 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:09:59 np0005531887 nova_compute[186849]: 2025-11-22 08:09:59.853 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:00 np0005531887 nova_compute[186849]: 2025-11-22 08:10:00.386 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Successfully created port: 26953fb8-b327-4ec5-9059-9bd2496c1121 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.374 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Successfully updated port: 26953fb8-b327-4ec5-9059-9bd2496c1121 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.412 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.413 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquired lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.413 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.507 186853 DEBUG nova.compute.manager [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Received event network-changed-26953fb8-b327-4ec5-9059-9bd2496c1121 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.508 186853 DEBUG nova.compute.manager [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Refreshing instance network info cache due to event network-changed-26953fb8-b327-4ec5-9059-9bd2496c1121. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.508 186853 DEBUG oslo_concurrency.lockutils [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.605 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:10:01 np0005531887 podman[231688]: 2025-11-22 08:10:01.845464085 +0000 UTC m=+0.064663217 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:10:01 np0005531887 podman[231689]: 2025-11-22 08:10:01.888500037 +0000 UTC m=+0.101945417 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:01 np0005531887 nova_compute[186849]: 2025-11-22 08:10:01.953 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.168 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.169 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.169 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.169 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b50dd877-42b1-46b2-933e-ee9a660a56c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.898 186853 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Updating instance_info_cache with network_info: [{"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.917 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Releasing lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.917 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Instance network_info: |[{"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.918 186853 DEBUG oslo_concurrency.lockutils [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.918 186853 DEBUG nova.network.neutron [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Refreshing network info cache for port 26953fb8-b327-4ec5-9059-9bd2496c1121 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.921 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Start _get_guest_xml network_info=[{"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.926 186853 WARNING nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.930 186853 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.930 186853 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.936 186853 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.937 186853 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.938 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.938 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.938 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.939 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.939 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.939 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.939 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.939 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.940 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.940 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.940 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.940 186853 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.943 186853 DEBUG nova.virt.libvirt.vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-2',id=126,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreat
eTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:59Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=1eab226d-c316-4033-b802-511921219249,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.944 186853 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.944 186853 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.945 186853 DEBUG nova.objects.instance [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1eab226d-c316-4033-b802-511921219249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.956 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <uuid>1eab226d-c316-4033-b802-511921219249</uuid>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <name>instance-0000007e</name>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:name>tempest-MultipleCreateTestJSON-server-519320456-2</nova:name>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:10:02</nova:creationTime>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:user uuid="c867ad823e59410b995507d3e85b3465">tempest-MultipleCreateTestJSON-1558462004-project-member</nova:user>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:project uuid="9c564dfb60114407b72d22a9c49ed513">tempest-MultipleCreateTestJSON-1558462004</nova:project>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        <nova:port uuid="26953fb8-b327-4ec5-9059-9bd2496c1121">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="serial">1eab226d-c316-4033-b802-511921219249</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="uuid">1eab226d-c316-4033-b802-511921219249</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.config"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ff:70:db"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <target dev="tap26953fb8-b3"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/console.log" append="off"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:10:02 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:10:02 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:10:02 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:10:02 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.957 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Preparing to wait for external event network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.958 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.958 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.958 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.959 186853 DEBUG nova.virt.libvirt.vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-2',id=126,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:59Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=1eab226d-c316-4033-b802-511921219249,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.959 186853 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.960 186853 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.960 186853 DEBUG os_vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.961 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.961 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.961 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.963 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.963 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26953fb8-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.964 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26953fb8-b3, col_values=(('external_ids', {'iface-id': '26953fb8-b327-4ec5-9059-9bd2496c1121', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:70:db', 'vm-uuid': '1eab226d-c316-4033-b802-511921219249'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:02 np0005531887 NetworkManager[55210]: <info>  [1763799002.9667] manager: (tap26953fb8-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.969 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.974 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:02 np0005531887 nova_compute[186849]: 2025-11-22 08:10:02.975 186853 INFO os_vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3')#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.016 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.017 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.018 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No VIF found with MAC fa:16:3e:ff:70:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.018 186853 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Using config drive#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.413 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.413 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.414 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.414 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.414 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.424 186853 INFO nova.compute.manager [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Terminating instance#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.432 186853 DEBUG nova.compute.manager [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:03 np0005531887 kernel: tap726decc7-82 (unregistering): left promiscuous mode
Nov 22 03:10:03 np0005531887 NetworkManager[55210]: <info>  [1763799003.4567] device (tap726decc7-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.474 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:03Z|00391|binding|INFO|Releasing lport 726decc7-8256-48c9-992a-051f7215b6fa from this chassis (sb_readonly=0)
Nov 22 03:10:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:03Z|00392|binding|INFO|Setting lport 726decc7-8256-48c9-992a-051f7215b6fa down in Southbound
Nov 22 03:10:03 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:03Z|00393|binding|INFO|Removing iface tap726decc7-82 ovn-installed in OVS
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.492 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:16:fc 10.100.0.4'], port_security=['fa:16:3e:22:16:fc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1b982188-a0e8-474c-a959-760a28dc3ffe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=726decc7-8256-48c9-992a-051f7215b6fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.494 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 726decc7-8256-48c9-992a-051f7215b6fa in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 unbound from our chassis#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.496 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 390460fe-fb7f-40ce-abb7-9e99dea93a54, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.497 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f7dffd-9937-4887-9f1a-b8ed06fc739f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.498 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace which is not needed anymore#033[00m
Nov 22 03:10:03 np0005531887 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Nov 22 03:10:03 np0005531887 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000007b.scope: Consumed 10.047s CPU time.
Nov 22 03:10:03 np0005531887 systemd-machined[153180]: Machine qemu-46-instance-0000007b terminated.
Nov 22 03:10:03 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [NOTICE]   (231626) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:03 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [NOTICE]   (231626) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:03 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [WARNING]  (231626) : Exiting Master process...
Nov 22 03:10:03 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [ALERT]    (231626) : Current worker (231628) exited with code 143 (Terminated)
Nov 22 03:10:03 np0005531887 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[231622]: [WARNING]  (231626) : All workers exited. Exiting... (0)
Nov 22 03:10:03 np0005531887 systemd[1]: libpod-80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b.scope: Deactivated successfully.
Nov 22 03:10:03 np0005531887 podman[231762]: 2025-11-22 08:10:03.636804002 +0000 UTC m=+0.048168609 container died 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.667 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:03 np0005531887 systemd[1]: var-lib-containers-storage-overlay-cb5487dd6275a854b48add95401328a5e187d979244c230d5ab82d54aedbed0f-merged.mount: Deactivated successfully.
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.700 186853 INFO nova.virt.libvirt.driver [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Instance destroyed successfully.#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.701 186853 DEBUG nova.objects.instance [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'resources' on Instance uuid 1b982188-a0e8-474c-a959-760a28dc3ffe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.713 186853 DEBUG nova.virt.libvirt.vif [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1940159938',display_name='tempest-TestServerMultinode-server-1940159938',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1940159938',id=123,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-0dc019pn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:54Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=1b982188-a0e8-474c-a959-760a28dc3ffe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.713 186853 DEBUG nova.network.os_vif_util [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "726decc7-8256-48c9-992a-051f7215b6fa", "address": "fa:16:3e:22:16:fc", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap726decc7-82", "ovs_interfaceid": "726decc7-8256-48c9-992a-051f7215b6fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.713 186853 DEBUG nova.network.os_vif_util [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.714 186853 DEBUG os_vif [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.715 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.715 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap726decc7-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.718 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.721 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.724 186853 INFO os_vif [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:16:fc,bridge_name='br-int',has_traffic_filtering=True,id=726decc7-8256-48c9-992a-051f7215b6fa,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap726decc7-82')#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.726 186853 INFO nova.virt.libvirt.driver [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Deleting instance files /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe_del#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.726 186853 INFO nova.virt.libvirt.driver [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Deletion of /var/lib/nova/instances/1b982188-a0e8-474c-a959-760a28dc3ffe_del complete#033[00m
Nov 22 03:10:03 np0005531887 podman[231762]: 2025-11-22 08:10:03.727440459 +0000 UTC m=+0.138805076 container cleanup 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:10:03 np0005531887 systemd[1]: libpod-conmon-80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b.scope: Deactivated successfully.
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.783 186853 INFO nova.compute.manager [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.784 186853 DEBUG oslo.service.loopingcall [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.784 186853 DEBUG nova.compute.manager [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.785 186853 DEBUG nova.network.neutron [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:10:03 np0005531887 podman[231807]: 2025-11-22 08:10:03.968704023 +0000 UTC m=+0.218561505 container remove 80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.976 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f935b5c3-742f-41bf-afb6-0f951ad14c04]: (4, ('Sat Nov 22 08:10:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b)\n80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b\nSat Nov 22 08:10:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b)\n80cadbc8fd663a53dca669c90b95a9f4e72adec30f653ff5d808294d10f94c5b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.978 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9c532f61-e18e-49f9-9fe4-978090063be5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:03.979 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.980 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:03 np0005531887 kernel: tap390460fe-f0: left promiscuous mode
Nov 22 03:10:03 np0005531887 nova_compute[186849]: 2025-11-22 08:10:03.998 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.002 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[35589115-0342-4dd1-aff0-7db4706d5662]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.019 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0bce4c30-1cb2-4b49-9588-e977982e5be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.021 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[179e3cc4-1b5e-41f9-8cf3-ffd75f64cf41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.038 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cda4d43a-d97e-4989-9c93-2ee0e49810b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573593, 'reachable_time': 36321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231825, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.041 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.041 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[69b4551d-0b73-48a5-b135-c1164c1c688d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 systemd[1]: run-netns-ovnmeta\x2d390460fe\x2dfb7f\x2d40ce\x2dabb7\x2d9e99dea93a54.mount: Deactivated successfully.
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.397 186853 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Creating config drive at /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.config#033[00m
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.403 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqzlyl5al execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.532 186853 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqzlyl5al" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:10:04 np0005531887 kernel: tap26953fb8-b3: entered promiscuous mode
Nov 22 03:10:04 np0005531887 NetworkManager[55210]: <info>  [1763799004.5983] manager: (tap26953fb8-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Nov 22 03:10:04 np0005531887 systemd-udevd[231741]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:10:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:04Z|00394|binding|INFO|Claiming lport 26953fb8-b327-4ec5-9059-9bd2496c1121 for this chassis.
Nov 22 03:10:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:04Z|00395|binding|INFO|26953fb8-b327-4ec5-9059-9bd2496c1121: Claiming fa:16:3e:ff:70:db 10.100.0.10
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:04 np0005531887 NetworkManager[55210]: <info>  [1763799004.6115] device (tap26953fb8-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:10:04 np0005531887 NetworkManager[55210]: <info>  [1763799004.6131] device (tap26953fb8-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:10:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:04Z|00396|binding|INFO|Setting lport 26953fb8-b327-4ec5-9059-9bd2496c1121 ovn-installed in OVS
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.615 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:04 np0005531887 nova_compute[186849]: 2025-11-22 08:10:04.621 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:04 np0005531887 systemd-machined[153180]: New machine qemu-47-instance-0000007e.
Nov 22 03:10:04 np0005531887 systemd[1]: Started Virtual Machine qemu-47-instance-0000007e.
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.733 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:70:db 10.100.0.10'], port_security=['fa:16:3e:ff:70:db 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1eab226d-c316-4033-b802-511921219249', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=26953fb8-b327-4ec5-9059-9bd2496c1121) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:04 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:04Z|00397|binding|INFO|Setting lport 26953fb8-b327-4ec5-9059-9bd2496c1121 up in Southbound
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.735 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 26953fb8-b327-4ec5-9059-9bd2496c1121 in datapath c75f33da-8305-4145-97ef-eef656e4f067 bound to our chassis#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.737 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c75f33da-8305-4145-97ef-eef656e4f067#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.747 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9eaa98-4e91-449d-baca-875399f49ab4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.748 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc75f33da-81 in ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.751 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc75f33da-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.751 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[17851bf3-931d-4707-999a-6a95b2a52c1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.752 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0590e18f-cfa7-4b58-8040-c5339207063b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.764 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f8dfb6-7ff9-41fd-9baa-fe368069f10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.786 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8efc6d-f1da-40ee-b8e0-6c3256506f6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.816 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c5de429b-94aa-4012-9540-ff2e79919af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.821 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4dca0eb7-5c18-42e5-bbb5-4abce9f9026e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 NetworkManager[55210]: <info>  [1763799004.8224] manager: (tapc75f33da-80): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.851 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4934b8dc-6dbd-4215-a6d5-ff4eee4b6bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.854 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6c838142-ca93-4c77-aa9c-c87eee4eacb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 NetworkManager[55210]: <info>  [1763799004.8821] device (tapc75f33da-80): carrier: link connected
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.891 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[d63ea062-c757-40ba-88ad-3fc43629b01d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.911 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0adfea7d-76a9-4017-8592-5deb915dff79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574748, 'reachable_time': 29470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231874, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.928 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3c980d97-6cee-452c-8ba2-fc638af2cdec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c898'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574748, 'tstamp': 574748}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231875, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.947 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[443d58d9-f298-4ff0-a1d1-e33dfa9e0448]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574748, 'reachable_time': 29470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231878, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:04.979 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4b473a39-ca9d-417a-b9fa-3cc4b9e780c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.043 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[def35771-660c-45ce-a4e3-e47d1eef5a23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.046 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.046 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.046 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799005.0454628, 1eab226d-c316-4033-b802-511921219249 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.046 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc75f33da-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.046 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] VM Started (Lifecycle Event)#033[00m
Nov 22 03:10:05 np0005531887 NetworkManager[55210]: <info>  [1763799005.0486] manager: (tapc75f33da-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Nov 22 03:10:05 np0005531887 kernel: tapc75f33da-80: entered promiscuous mode
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.049 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.052 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc75f33da-80, col_values=(('external_ids', {'iface-id': 'd2b1e9d2-8364-40b7-8c31-edbcc237653b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.053 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:05Z|00398|binding|INFO|Releasing lport d2b1e9d2-8364-40b7-8c31-edbcc237653b from this chassis (sb_readonly=0)
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.066 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.067 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.068 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[505b2dd0-fbd9-4fbf-9071-5f726b3bf9a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.068 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:10:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:05.069 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'env', 'PROCESS_TAG=haproxy-c75f33da-8305-4145-97ef-eef656e4f067', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c75f33da-8305-4145-97ef-eef656e4f067.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.086 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.091 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799005.0456638, 1eab226d-c316-4033-b802-511921219249 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.091 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.108 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.112 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.134 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:10:05 np0005531887 podman[231915]: 2025-11-22 08:10:05.423932507 +0000 UTC m=+0.048493529 container create 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:10:05 np0005531887 systemd[1]: Started libpod-conmon-509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c.scope.
Nov 22 03:10:05 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:10:05 np0005531887 podman[231915]: 2025-11-22 08:10:05.398620352 +0000 UTC m=+0.023181394 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:10:05 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63b2c4f9f9013e671ab8a04c7f15ed3ec249cf5e54f83175f530b2a6942941ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:10:05 np0005531887 podman[231915]: 2025-11-22 08:10:05.508954814 +0000 UTC m=+0.133515866 container init 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.508 186853 DEBUG nova.compute.manager [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-unplugged-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.509 186853 DEBUG oslo_concurrency.lockutils [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.510 186853 DEBUG oslo_concurrency.lockutils [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.510 186853 DEBUG oslo_concurrency.lockutils [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.510 186853 DEBUG nova.compute.manager [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] No waiting events found dispatching network-vif-unplugged-726decc7-8256-48c9-992a-051f7215b6fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:05 np0005531887 nova_compute[186849]: 2025-11-22 08:10:05.511 186853 DEBUG nova.compute.manager [req-fe3fc802-944c-4ed0-8e01-08d02a434fce req-5ada4b40-49fe-4f43-98f1-7870571a3135 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-unplugged-726decc7-8256-48c9-992a-051f7215b6fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:10:05 np0005531887 podman[231915]: 2025-11-22 08:10:05.516157863 +0000 UTC m=+0.140718885 container start 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:05 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [NOTICE]   (231934) : New worker (231936) forked
Nov 22 03:10:05 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [NOTICE]   (231934) : Loading success.
Nov 22 03:10:06 np0005531887 podman[231945]: 2025-11-22 08:10:06.837248005 +0000 UTC m=+0.059114130 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:10:06 np0005531887 nova_compute[186849]: 2025-11-22 08:10:06.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:06 np0005531887 nova_compute[186849]: 2025-11-22 08:10:06.983 186853 DEBUG nova.network.neutron [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:06 np0005531887 nova_compute[186849]: 2025-11-22 08:10:06.993 186853 DEBUG nova.compute.manager [req-a1e2f14b-99ea-4776-bf5f-01245e0d5bea req-fcdf708e-2032-49d9-920e-422e620cf16f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-deleted-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:06 np0005531887 nova_compute[186849]: 2025-11-22 08:10:06.993 186853 INFO nova.compute.manager [req-a1e2f14b-99ea-4776-bf5f-01245e0d5bea req-fcdf708e-2032-49d9-920e-422e620cf16f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Neutron deleted interface 726decc7-8256-48c9-992a-051f7215b6fa; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:10:06 np0005531887 nova_compute[186849]: 2025-11-22 08:10:06.993 186853 DEBUG nova.network.neutron [req-a1e2f14b-99ea-4776-bf5f-01245e0d5bea req-fcdf708e-2032-49d9-920e-422e620cf16f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.087 186853 INFO nova.compute.manager [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Took 3.30 seconds to deallocate network for instance.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.099 186853 DEBUG nova.network.neutron [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Updated VIF entry in instance network info cache for port 26953fb8-b327-4ec5-9059-9bd2496c1121. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.100 186853 DEBUG nova.network.neutron [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Updating instance_info_cache with network_info: [{"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.124 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.130 186853 DEBUG oslo_concurrency.lockutils [req-ea82a668-775c-409e-8582-09079abff2f4 req-fd7803a9-5b9e-48a2-a4cd-b2b4dcbad097 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1eab226d-c316-4033-b802-511921219249" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.150 186853 DEBUG nova.compute.manager [req-a1e2f14b-99ea-4776-bf5f-01245e0d5bea req-fcdf708e-2032-49d9-920e-422e620cf16f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Detach interface failed, port_id=726decc7-8256-48c9-992a-051f7215b6fa, reason: Instance 1b982188-a0e8-474c-a959-760a28dc3ffe could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.246 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.247 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.548 186853 DEBUG nova.compute.provider_tree [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.564 186853 DEBUG nova.scheduler.client.report [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.593 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.617 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.617 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.618 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.618 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.618 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] No waiting events found dispatching network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.619 186853 WARNING nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Received unexpected event network-vif-plugged-726decc7-8256-48c9-992a-051f7215b6fa for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.619 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Received event network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.619 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.620 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.620 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.620 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Processing event network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.620 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Received event network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.621 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.621 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.621 186853 DEBUG oslo_concurrency.lockutils [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.621 186853 DEBUG nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] No waiting events found dispatching network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.622 186853 WARNING nova.compute.manager [req-72775921-d993-4c3b-8af5-255756b09f51 req-39a0a6fb-e136-4110-b0e5-bcecfb0474c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Received unexpected event network-vif-plugged-26953fb8-b327-4ec5-9059-9bd2496c1121 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.622 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.627 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799007.6271398, 1eab226d-c316-4033-b802-511921219249 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.627 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.630 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.634 186853 INFO nova.virt.libvirt.driver [-] [instance: 1eab226d-c316-4033-b802-511921219249] Instance spawned successfully.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.635 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.650 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.659 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.659 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.660 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.660 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.660 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.661 186853 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.663 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.687 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.694 186853 INFO nova.scheduler.client.report [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Deleted allocations for instance 1b982188-a0e8-474c-a959-760a28dc3ffe#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.773 186853 INFO nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Took 8.29 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.774 186853 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.790 186853 DEBUG oslo_concurrency.lockutils [None req-4731b953-c1ec-4da3-9153-5d1a690c607c 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "1b982188-a0e8-474c-a959-760a28dc3ffe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.898 186853 INFO nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Took 8.95 seconds to build instance.#033[00m
Nov 22 03:10:07 np0005531887 nova_compute[186849]: 2025-11-22 08:10:07.956 186853 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.184 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.204 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.204 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.205 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.205 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.205 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:08 np0005531887 nova_compute[186849]: 2025-11-22 08:10:08.719 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:09 np0005531887 nova_compute[186849]: 2025-11-22 08:10:09.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:11 np0005531887 nova_compute[186849]: 2025-11-22 08:10:11.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:11 np0005531887 nova_compute[186849]: 2025-11-22 08:10:11.928 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.250 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.372 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.372 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.373 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "1eab226d-c316-4033-b802-511921219249-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.373 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.373 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.382 186853 INFO nova.compute.manager [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Terminating instance#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.389 186853 DEBUG nova.compute.manager [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:12 np0005531887 kernel: tap26953fb8-b3 (unregistering): left promiscuous mode
Nov 22 03:10:12 np0005531887 NetworkManager[55210]: <info>  [1763799012.4126] device (tap26953fb8-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.426 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:12Z|00399|binding|INFO|Releasing lport 26953fb8-b327-4ec5-9059-9bd2496c1121 from this chassis (sb_readonly=0)
Nov 22 03:10:12 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:12Z|00400|binding|INFO|Setting lport 26953fb8-b327-4ec5-9059-9bd2496c1121 down in Southbound
Nov 22 03:10:12 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:12Z|00401|binding|INFO|Removing iface tap26953fb8-b3 ovn-installed in OVS
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.433 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:70:db 10.100.0.10'], port_security=['fa:16:3e:ff:70:db 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1eab226d-c316-4033-b802-511921219249', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=26953fb8-b327-4ec5-9059-9bd2496c1121) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.435 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 26953fb8-b327-4ec5-9059-9bd2496c1121 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis#033[00m
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.437 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.439 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8c1aaa-8105-4a16-91dc-e97c30f6db4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.439 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace which is not needed anymore#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.441 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Nov 22 03:10:12 np0005531887 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000007e.scope: Consumed 5.172s CPU time.
Nov 22 03:10:12 np0005531887 systemd-machined[153180]: Machine qemu-47-instance-0000007e terminated.
Nov 22 03:10:12 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [NOTICE]   (231934) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:12 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [NOTICE]   (231934) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:12 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [WARNING]  (231934) : Exiting Master process...
Nov 22 03:10:12 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [ALERT]    (231934) : Current worker (231936) exited with code 143 (Terminated)
Nov 22 03:10:12 np0005531887 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[231930]: [WARNING]  (231934) : All workers exited. Exiting... (0)
Nov 22 03:10:12 np0005531887 systemd[1]: libpod-509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c.scope: Deactivated successfully.
Nov 22 03:10:12 np0005531887 podman[232001]: 2025-11-22 08:10:12.593947652 +0000 UTC m=+0.050046076 container died 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.617 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.622 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:12 np0005531887 systemd[1]: var-lib-containers-storage-overlay-63b2c4f9f9013e671ab8a04c7f15ed3ec249cf5e54f83175f530b2a6942941ab-merged.mount: Deactivated successfully.
Nov 22 03:10:12 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:12 np0005531887 podman[232001]: 2025-11-22 08:10:12.644597042 +0000 UTC m=+0.100695446 container cleanup 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:10:12 np0005531887 systemd[1]: libpod-conmon-509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c.scope: Deactivated successfully.
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.672 186853 INFO nova.virt.libvirt.driver [-] [instance: 1eab226d-c316-4033-b802-511921219249] Instance destroyed successfully.#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.672 186853 DEBUG nova.objects.instance [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'resources' on Instance uuid 1eab226d-c316-4033-b802-511921219249 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:12 np0005531887 podman[232045]: 2025-11-22 08:10:12.731912197 +0000 UTC m=+0.056369512 container remove 509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.735 186853 DEBUG nova.virt.libvirt.vif [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-2',id=126,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-22T08:10:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:07Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=1eab226d-c316-4033-b802-511921219249,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.737 186853 DEBUG nova.network.os_vif_util [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "26953fb8-b327-4ec5-9059-9bd2496c1121", "address": "fa:16:3e:ff:70:db", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26953fb8-b3", "ovs_interfaceid": "26953fb8-b327-4ec5-9059-9bd2496c1121", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.738 186853 DEBUG nova.network.os_vif_util [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.737 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfa9b64-dba9-47ce-a332-12b16ee3dd80]: (4, ('Sat Nov 22 08:10:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c)\n509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c\nSat Nov 22 08:10:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c)\n509ab56ef7aa9bd0f674659ea64d4f66b487ec1e42041218f1b41d831a48ec4c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.738 186853 DEBUG os_vif [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.740 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.740 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0055e8c0-7d80-463d-bf3d-64e1a9124066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.741 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26953fb8-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.742 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.743 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:12 np0005531887 kernel: tapc75f33da-80: left promiscuous mode
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.750 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4c735d6e-bd19-4400-bb9d-c3b0722ead90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.751 186853 INFO os_vif [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:70:db,bridge_name='br-int',has_traffic_filtering=True,id=26953fb8-b327-4ec5-9059-9bd2496c1121,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26953fb8-b3')
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.752 186853 INFO nova.virt.libvirt.driver [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Deleting instance files /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249_del
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.752 186853 INFO nova.virt.libvirt.driver [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Deletion of /var/lib/nova/instances/1eab226d-c316-4033-b802-511921219249_del complete
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.771 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc57a46-d223-45d9-955f-a9b1de9b6683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.772 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d17fe958-2555-407e-a8b2-91d782ffdabc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.792 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e1202d65-fe11-4a3a-b44c-92ee542f0849]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574741, 'reachable_time': 35417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232061, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.795 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 03:10:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:12.795 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[fc516459-880b-46c5-8532-90346905d369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:10:12 np0005531887 systemd[1]: run-netns-ovnmeta\x2dc75f33da\x2d8305\x2d4145\x2d97ef\x2deef656e4f067.mount: Deactivated successfully.
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.942 186853 INFO nova.compute.manager [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Took 0.55 seconds to destroy the instance on the hypervisor.
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.943 186853 DEBUG oslo.service.loopingcall [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.944 186853 DEBUG nova.compute.manager [-] [instance: 1eab226d-c316-4033-b802-511921219249] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 03:10:12 np0005531887 nova_compute[186849]: 2025-11-22 08:10:12.944 186853 DEBUG nova.network.neutron [-] [instance: 1eab226d-c316-4033-b802-511921219249] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 03:10:13 np0005531887 podman[232062]: 2025-11-22 08:10:13.834872876 +0000 UTC m=+0.053346917 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.400 186853 DEBUG nova.network.neutron [-] [instance: 1eab226d-c316-4033-b802-511921219249] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.472 186853 INFO nova.compute.manager [-] [instance: 1eab226d-c316-4033-b802-511921219249] Took 1.53 seconds to deallocate network for instance.
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.538 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.538 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.640 186853 DEBUG nova.compute.manager [req-72ed91ad-8b8f-4b6b-93ae-1f3883be6c0d req-311355ec-550d-4b0f-87be-c98c7ac974ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1eab226d-c316-4033-b802-511921219249] Received event network-vif-deleted-26953fb8-b327-4ec5-9059-9bd2496c1121 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.641 186853 DEBUG nova.compute.provider_tree [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.656 186853 DEBUG nova.scheduler.client.report [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.693 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.723 186853 INFO nova.scheduler.client.report [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Deleted allocations for instance 1eab226d-c316-4033-b802-511921219249
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.801 186853 DEBUG oslo_concurrency.lockutils [None req-ae470347-63f0-4475-9f6e-479eab1535c7 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "1eab226d-c316-4033-b802-511921219249" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:10:14 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:14Z|00402|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:14 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:14Z|00403|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:14 np0005531887 nova_compute[186849]: 2025-11-22 08:10:14.967 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:16.775 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:10:16 np0005531887 nova_compute[186849]: 2025-11-22 08:10:16.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:16.777 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:10:16 np0005531887 nova_compute[186849]: 2025-11-22 08:10:16.931 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:17 np0005531887 nova_compute[186849]: 2025-11-22 08:10:17.743 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:17 np0005531887 podman[232082]: 2025-11-22 08:10:17.844234881 +0000 UTC m=+0.063876747 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 22 03:10:18 np0005531887 nova_compute[186849]: 2025-11-22 08:10:18.700 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799003.698805, 1b982188-a0e8-474c-a959-760a28dc3ffe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:10:18 np0005531887 nova_compute[186849]: 2025-11-22 08:10:18.701 186853 INFO nova.compute.manager [-] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] VM Stopped (Lifecycle Event)
Nov 22 03:10:18 np0005531887 nova_compute[186849]: 2025-11-22 08:10:18.748 186853 DEBUG nova.compute.manager [None req-565d2ebc-3314-4b8d-b507-8fc409c8612e - - - - - -] [instance: 1b982188-a0e8-474c-a959-760a28dc3ffe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:10:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:18.779 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:10:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:18Z|00404|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:18 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:18Z|00405|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:18 np0005531887 nova_compute[186849]: 2025-11-22 08:10:18.980 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:20Z|00406|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:20Z|00407|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:20 np0005531887 nova_compute[186849]: 2025-11-22 08:10:20.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:21 np0005531887 nova_compute[186849]: 2025-11-22 08:10:21.934 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:22 np0005531887 nova_compute[186849]: 2025-11-22 08:10:22.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:10:22 np0005531887 podman[232103]: 2025-11-22 08:10:22.83631551 +0000 UTC m=+0.057710615 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:10:26 np0005531887 nova_compute[186849]: 2025-11-22 08:10:26.936 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:26 np0005531887 nova_compute[186849]: 2025-11-22 08:10:26.970 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:27 np0005531887 nova_compute[186849]: 2025-11-22 08:10:27.671 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799012.6698966, 1eab226d-c316-4033-b802-511921219249 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:27 np0005531887 nova_compute[186849]: 2025-11-22 08:10:27.672 186853 INFO nova.compute.manager [-] [instance: 1eab226d-c316-4033-b802-511921219249] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:10:27 np0005531887 nova_compute[186849]: 2025-11-22 08:10:27.708 186853 DEBUG nova.compute.manager [None req-c1a6398c-9d24-4f87-9b87-3a8dd454ba29 - - - - - -] [instance: 1eab226d-c316-4033-b802-511921219249] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:27 np0005531887 nova_compute[186849]: 2025-11-22 08:10:27.747 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:29 np0005531887 nova_compute[186849]: 2025-11-22 08:10:29.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:29 np0005531887 podman[232127]: 2025-11-22 08:10:29.850429818 +0000 UTC m=+0.062240226 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Nov 22 03:10:31 np0005531887 nova_compute[186849]: 2025-11-22 08:10:31.939 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:32 np0005531887 nova_compute[186849]: 2025-11-22 08:10:32.749 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:32 np0005531887 podman[232148]: 2025-11-22 08:10:32.831803465 +0000 UTC m=+0.055048370 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:10:32 np0005531887 podman[232149]: 2025-11-22 08:10:32.859367484 +0000 UTC m=+0.079855171 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:10:33 np0005531887 nova_compute[186849]: 2025-11-22 08:10:33.739 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.668 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'name': 'tempest-TestGettingAddress-server-1979784336', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000074', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.687 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/cpu volume: 14320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2febb757-8a9a-44d5-aeb4-b2fb765c2795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14320000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'timestamp': '2025-11-22T08:10:36.668942', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b9dfc83c-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.356707421, 'message_signature': 'ad98c171b4af67d14830eb01b6eb38ae794912cc079cdf8b16427e05cb7c25c4'}]}, 'timestamp': '2025-11-22 08:10:36.687556', '_unique_id': '18eaa951682b48b987b712b467e86d4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.700 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.701 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbdb78f2-ed98-4e09-b558-0bb1b45243b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.689344', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9e1ddb6-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': '7109dab971b6ee138c367b769ecb1d039b78d6513aa6062bfae1e0461062d13f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.689344', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9e1eec8-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': 'd18c139fc9b18dbecc3542e870a434d78a8aee9e69100f68807518f76fc5c5b0'}]}, 'timestamp': '2025-11-22 08:10:36.701640', '_unique_id': '89ac9ceb78be4b0c9087adbdb426ef4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.704 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.704 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>]
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.706 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b50dd877-42b1-46b2-933e-ee9a660a56c3 / tap1b5f134e-57 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.707 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b50dd877-42b1-46b2-933e-ee9a660a56c3 / tap31719a20-f6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.707 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.708 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58c3b487-061e-4fd0-a362-ab6fba2f5241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.704549', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e2e8f0-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '5f2576ed25f45aac0861d6da7464d95cfdff2b66eaa58c1be96885ce9f88bbf6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.704549', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e2faa2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'c8b304cc89e7a829d5418dcadbd0c309b1f6ed6274d13de6750f8a8c23d25a20'}]}, 'timestamp': '2025-11-22 08:10:36.708491', '_unique_id': '24981fb1d55145839598e1f1f55f35a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.737 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.bytes volume: 31013376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.738 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a88eb282-962f-4b67-9d07-4e36dc639bf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31013376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.710865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9e78ef0-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': '6254583eb35e61bbabfd79e7db204a94d6be95a28590ef1cd068ea6d5c771f00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.710865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9e79dc8-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'ccf8fcb8e03366a56a3a1b137355983892aadeda0a9cd821bcf30a61ab9687b7'}]}, 'timestamp': '2025-11-22 08:10:36.738870', '_unique_id': 'df43d4c77e324d5984d8c663ffc9cefc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.741 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.741 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de5a8e0a-2e73-46a5-a55b-2830def93bb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.741249', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e809fc-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '740850eb0a3c87db7ba9f36efc6af792ebcf773f0697d487cd71d6a8a6fb5beb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.741249', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e816ae-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'cdee866da334645f81d00877b5b55ce90cfb746cfcac2959e6b212d67772fc27'}]}, 'timestamp': '2025-11-22 08:10:36.742307', '_unique_id': '013d06ca864f4f199e3a54de66916980'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.744 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.bytes volume: 29533 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.744 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.bytes volume: 2236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53039fe0-4617-4ca3-bb86-485abc6e3475', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29533, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.744059', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e87568-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '81bc40358701c520bc3d57c296ac340f28f633052e34f696afcfef2e3be83b46'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2236, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.744059', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e882e2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'd3220fc4d1884bb886bac462599afde2285f81bfb62b0d3954b0b8b18288bd56'}]}, 'timestamp': '2025-11-22 08:10:36.744731', '_unique_id': '3c67bbbdc811493aab39688e0f5dcbce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.746 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets volume: 170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.746 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f4caaf4-6799-4c89-9764-6cd78e040d14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 170, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.746439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e8d24c-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '56169addbce1bb52ef574cb92d04f88205d1b32609360115becf468c1c4d68a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.746439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e8deea-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '86f2748b9d3d960f31c034d8c6755d08ecb60264097e3c941d1ea0d005c38375'}]}, 'timestamp': '2025-11-22 08:10:36.747089', '_unique_id': 'b118ff2c20b64f069945e98e60343f09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.748 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.748 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>]
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.749 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.749 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4522c6d6-e47e-4f92-adb4-d9604e8c1869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.749225', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e94056-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '12e3e6a6b2d04cf2a59875cee4a39bfe92fe75d6195255439a24c669b166bf23'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.749225', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e94c5e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '30b8e7519b620a78bb957d1853f0e954479d1bac6bba93ce17d2232d6785739f'}]}, 'timestamp': '2025-11-22 08:10:36.749895', '_unique_id': 'fac1889b207b4f4087cac48b58fd0b3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.751 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.751 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e99955c9-90ec-4c04-a394-5e3daa4de450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.751610', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9e99c7c-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'fd267c6c1f3fc72abecda37c96be73f6d92794ff6a62dcbbdaba8334267b16c2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.751610', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9e9a8a2-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '37b126d2380d63650fa7fc9980bedc9b84768e01bfd3d9a9be2079d9b82c0e9f'}]}, 'timestamp': '2025-11-22 08:10:36.752248', '_unique_id': '7a9da6f05a694d68b6b4dc71e41a107e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.753 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.754 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>]
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.754 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.754 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.754 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85214297-3611-4b9e-a014-7da9c75ba059', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.754374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9ea0874-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': '255199298b766fdb4f9b5f3a01fdf984c4228bdacc20858612c21218f4b4218b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.754374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9ea1468-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': '04cbce49b651186df444893d9878cdddb96e4bfe7bb7c86739b81159d9a9a7c6'}]}, 'timestamp': '2025-11-22 08:10:36.755003', '_unique_id': '21afa5b0210d465da3c937cc14fe47c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.756 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.latency volume: 18211473098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.757 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dabf0edf-3048-411c-944b-d98e4962b6eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18211473098, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.756714', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9ea63fa-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': '134e8d36e6fb6c00027541ce7cf3c021db073a019791473cde60f541cc03ded4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.756714', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9ea744e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'd8772da656fca18456eba4c8178e9f15b5541a5afadbc144eb46909043fc315f'}]}, 'timestamp': '2025-11-22 08:10:36.757462', '_unique_id': '458bbcb59b6445d487642b3ded9c7281'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.759 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.760 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b297e03-2481-4a28-8c0c-932d8b8597af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.759897', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9eae12c-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'f8a530c9c70a97ef130f9c9c1fecf48152cde5511c79610c41fd7b0f1eb66e9a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.759897', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9eaef00-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': '8db26c160c9a2cb3fab9bcda89d4689674e15e4000051dc1c0ef89eb16f9603e'}]}, 'timestamp': '2025-11-22 08:10:36.760605', '_unique_id': '95bc0680126847d1952e791562eb099a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.762 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.762 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b9c768f-f6d3-4a86-8b6d-04bc3e433a21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.762543', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9eb47c0-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': 'cf695891b35c20b089180a9ea54ab072466163e6502b74ba8a12a8e6ada2ffcd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.762543', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9eb5436-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.359151972, 'message_signature': 'cdf12f076a17022e182da0b281f1a222d4309db21b4e9b841cb754173bb79ba6'}]}, 'timestamp': '2025-11-22 08:10:36.763190', '_unique_id': '50961bf6e2474e56b534cb2b43ee94b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.765 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.requests volume: 347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.766 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd066bf4a-8210-4eda-a002-e4d8908e360f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 347, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.765698', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9ebc57e-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': '4068ef51fff15b5687f61e92350c64d32c5e0d34d23bbc34cba4038a2f02bd51'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.765698', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9ebd348-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': '517daa633f66f94548301f780a0f17ad6e660f0a1ae6af23ba11416d11777dad'}]}, 'timestamp': '2025-11-22 08:10:36.766448', '_unique_id': '3ac6489d4fe54322b5483b38bc8d8001'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.768 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets volume: 163 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.769 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cca62e04-94ed-4373-a59d-a952dac5db73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 163, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.768662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9ec3838-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'e8e1955f9445808942563435548a269ee225a2e58977eb286d4304db2f812268'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.768662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9ec46ac-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'b5f748dcee2d26b5b54e358e788b60474d0e470df11129c028944c72182a671c'}]}, 'timestamp': '2025-11-22 08:10:36.769430', '_unique_id': '9c39926e7cf845778039c22775e9ba89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.771 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.771 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1979784336>]
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.771 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.772 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8651447d-97fd-490b-8583-4d36d84948e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.771865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9ecb466-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '42d0a129ec9b5127689e39caa34ff3f2fd0a7c4ffac51d278fbc56e402004d29'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.771865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9ecc2e4-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'f7c78edcd92a938b28c4cc179c4daa71795fda1e5f82ac7b00efc26143e205b7'}]}, 'timestamp': '2025-11-22 08:10:36.772592', '_unique_id': 'beacd54da66149b79a61a57f8c87f93e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.774 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.bytes volume: 26978 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.774 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.bytes volume: 3486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '093f8afe-11cf-4018-a283-50e224d2bf0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26978, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.774441', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9ed182a-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': 'bf4c69ac9012aaaf6ac51ce3998e8e07e6518f0378b2cb66877f5e90c0724af0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3486, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.774441', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9ed2496-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '0bdb1a20af4044e9007999a5812dfa15dcc8f6773f2ac9c33d9e4119bb1ab330'}]}, 'timestamp': '2025-11-22 08:10:36.775089', '_unique_id': '60edeb7cfb9a473ea1a0ebac2ba87497'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.776 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.latency volume: 1093495069 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.777 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.read.latency volume: 223459020 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fcd4c76-81a0-4ddf-a245-7bddda6933f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1093495069, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.776860', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9ed78ec-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'd89dd23df18014631f9a04d76888a657b2e23498eb3658667192badf9e0d6fa2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223459020, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.776860', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9ed86ca-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'a19feb260c9e5a5d724fd6c531b24afc12003c9990a7f260956922ecc9943975'}]}, 'timestamp': '2025-11-22 08:10:36.777600', '_unique_id': 'cdfd02f614e44cd9bbd7552cc23c791b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.779 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.779 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '896a0c1a-3ac1-4274-a98e-e079290a3a6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap1b5f134e-57', 'timestamp': '2025-11-22T08:10:36.779594', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap1b5f134e-57', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:bd:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5f134e-57'}, 'message_id': 'b9ede39a-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '051f7916c776ce4db863db8a80988233caf2ba73068cce0cde6233735d582230'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000074-b50dd877-42b1-46b2-933e-ee9a660a56c3-tap31719a20-f6', 'timestamp': '2025-11-22T08:10:36.779594', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'tap31719a20-f6', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:ae:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap31719a20-f6'}, 'message_id': 'b9edf04c-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.374371618, 'message_signature': '51fb9f72ca3835482f1df74160cef06a82289a865d42abeff2fc48fca35bddf2'}]}, 'timestamp': '2025-11-22 08:10:36.780346', '_unique_id': '775384372eab45b7a8c8b4591416daf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.782 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.bytes volume: 73134080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.782 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0daacc14-be11-4aad-a9d2-771ad34c9ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73134080, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-vda', 'timestamp': '2025-11-22T08:10:36.782503', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b9ee5320-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'de5867f59abc1d68ba6bf725f53990f2ef1ce82e657a94386402e60e1f47c24d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3-sda', 'timestamp': '2025-11-22T08:10:36.782503', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b9ee5f64-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.380691943, 'message_signature': 'ba3277c91d85f3fdfc1db89ac56888cfb8f7391fcb2ff18736d89b927871237c'}]}, 'timestamp': '2025-11-22 08:10:36.783137', '_unique_id': 'b439bdc64c664135b635064b9eab122c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.784 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.785 12 DEBUG ceilometer.compute.pollsters [-] b50dd877-42b1-46b2-933e-ee9a660a56c3/memory.usage volume: 48.04296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82743f20-dd76-4085-907f-20dba2e93103', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 48.04296875, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'timestamp': '2025-11-22T08:10:36.785033', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1979784336', 'name': 'instance-00000074', 'instance_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b9eeb5d6-c77a-11f0-9b25-fa163ecc0304', 'monotonic_time': 5779.356707421, 'message_signature': 'a9e28729214cb138804a628621bdca8e712e736ca999f061b66bdc615dc5fa56'}]}, 'timestamp': '2025-11-22 08:10:36.785379', '_unique_id': 'ee45924e84f7473b827a01b92cdd021b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:10:36.786 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:10:36 np0005531887 nova_compute[186849]: 2025-11-22 08:10:36.941 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531887 nova_compute[186849]: 2025-11-22 08:10:37.121 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:37.344 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:37.345 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:37.346 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:37Z|00408|binding|INFO|Releasing lport ec231e2a-1042-4a3a-b541-060f5a121bb8 from this chassis (sb_readonly=0)
Nov 22 03:10:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:37Z|00409|binding|INFO|Releasing lport 288f6565-c1a7-412f-8593-8864123e2215 from this chassis (sb_readonly=0)
Nov 22 03:10:37 np0005531887 nova_compute[186849]: 2025-11-22 08:10:37.522 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531887 nova_compute[186849]: 2025-11-22 08:10:37.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:37 np0005531887 podman[232192]: 2025-11-22 08:10:37.853202906 +0000 UTC m=+0.070279965 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.035 186853 DEBUG nova.compute.manager [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.036 186853 DEBUG nova.compute.manager [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing instance network info cache due to event network-changed-1b5f134e-5728-4e7f-ba86-8650cc0b721d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.037 186853 DEBUG oslo_concurrency.lockutils [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.037 186853 DEBUG oslo_concurrency.lockutils [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.037 186853 DEBUG nova.network.neutron [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Refreshing network info cache for port 1b5f134e-5728-4e7f-ba86-8650cc0b721d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.112 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.112 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.113 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.113 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.114 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.123 186853 INFO nova.compute.manager [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Terminating instance#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.131 186853 DEBUG nova.compute.manager [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:10:39 np0005531887 kernel: tap1b5f134e-57 (unregistering): left promiscuous mode
Nov 22 03:10:39 np0005531887 NetworkManager[55210]: <info>  [1763799039.1532] device (tap1b5f134e-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.160 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00410|binding|INFO|Releasing lport 1b5f134e-5728-4e7f-ba86-8650cc0b721d from this chassis (sb_readonly=0)
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00411|binding|INFO|Setting lport 1b5f134e-5728-4e7f-ba86-8650cc0b721d down in Southbound
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00412|binding|INFO|Removing iface tap1b5f134e-57 ovn-installed in OVS
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.168 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.182 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 kernel: tap31719a20-f6 (unregistering): left promiscuous mode
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.183 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:bd:73 10.100.0.11'], port_security=['fa:16:3e:9f:bd:73 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8e809e-e81c-4dfc-8977-f974433d5b3a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=1b5f134e-5728-4e7f-ba86-8650cc0b721d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:39 np0005531887 NetworkManager[55210]: <info>  [1763799039.1879] device (tap31719a20-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.186 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 1b5f134e-5728-4e7f-ba86-8650cc0b721d in datapath 8591a8a4-c35f-454b-ba4c-4ec37a8765b2 unbound from our chassis#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.189 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8591a8a4-c35f-454b-ba4c-4ec37a8765b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.189 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8aaca8-8637-450a-b15c-3b5c5e1078da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.190 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 namespace which is not needed anymore#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.196 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00413|binding|INFO|Releasing lport 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 from this chassis (sb_readonly=0)
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00414|binding|INFO|Setting lport 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 down in Southbound
Nov 22 03:10:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:10:39Z|00415|binding|INFO|Removing iface tap31719a20-f6 ovn-installed in OVS
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.198 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.206 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:ae:80 2001:db8::f816:3eff:fe37:ae80'], port_security=['fa:16:3e:37:ae:80 2001:db8::f816:3eff:fe37:ae80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe37:ae80/64', 'neutron:device_id': 'b50dd877-42b1-46b2-933e-ee9a660a56c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fecc702f-680b-424c-83ef-3f9c6214c28e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4afbec-9e59-4ffa-9128-10dc4f025189, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.214 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000074.scope: Deactivated successfully.
Nov 22 03:10:39 np0005531887 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000074.scope: Consumed 19.467s CPU time.
Nov 22 03:10:39 np0005531887 systemd-machined[153180]: Machine qemu-45-instance-00000074 terminated.
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [NOTICE]   (231175) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [NOTICE]   (231175) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [WARNING]  (231175) : Exiting Master process...
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [WARNING]  (231175) : Exiting Master process...
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [ALERT]    (231175) : Current worker (231183) exited with code 143 (Terminated)
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2[231170]: [WARNING]  (231175) : All workers exited. Exiting... (0)
Nov 22 03:10:39 np0005531887 systemd[1]: libpod-162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263.scope: Deactivated successfully.
Nov 22 03:10:39 np0005531887 podman[232246]: 2025-11-22 08:10:39.317141455 +0000 UTC m=+0.045370782 container died 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 03:10:39 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:39 np0005531887 systemd[1]: var-lib-containers-storage-overlay-7bc215a5d5fd906043a313f9814d67bddf90cd688eb4d224be8ca48a729bd9ae-merged.mount: Deactivated successfully.
Nov 22 03:10:39 np0005531887 podman[232246]: 2025-11-22 08:10:39.352919377 +0000 UTC m=+0.081148694 container cleanup 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:10:39 np0005531887 NetworkManager[55210]: <info>  [1763799039.3660] manager: (tap31719a20-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Nov 22 03:10:39 np0005531887 systemd[1]: libpod-conmon-162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263.scope: Deactivated successfully.
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.416 186853 INFO nova.virt.libvirt.driver [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Instance destroyed successfully.#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.417 186853 DEBUG nova.objects.instance [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid b50dd877-42b1-46b2-933e-ee9a660a56c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:10:39 np0005531887 podman[232281]: 2025-11-22 08:10:39.42153002 +0000 UTC m=+0.045645066 container remove 162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.428 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ff74261f-7e8b-4a7d-b235-4546b6e37571]: (4, ('Sat Nov 22 08:10:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 (162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263)\n162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263\nSat Nov 22 08:10:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 (162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263)\n162f5bb0dc18daf8868d5f0c0da05be0beae41e959d3cc6430ffb22a95df4263\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.430 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c53eb1-bb50-4f08-9a66-aad5519498f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.431 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8591a8a4-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 kernel: tap8591a8a4-c0: left promiscuous mode
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.458 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3796ac93-0629-452f-be33-990273eac33b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.471 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2bc843-6276-408a-a1d9-fafbb41518fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.473 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aa544aec-3cc0-4450-b5bb-ef5bda900514]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.475 186853 DEBUG nova.virt.libvirt.vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:05Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.475 186853 DEBUG nova.network.os_vif_util [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.476 186853 DEBUG nova.network.os_vif_util [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.476 186853 DEBUG os_vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.478 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b5f134e-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.484 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.486 186853 INFO os_vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:bd:73,bridge_name='br-int',has_traffic_filtering=True,id=1b5f134e-5728-4e7f-ba86-8650cc0b721d,network=Network(8591a8a4-c35f-454b-ba4c-4ec37a8765b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5f134e-57')#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.487 186853 DEBUG nova.virt.libvirt.vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1979784336',display_name='tempest-TestGettingAddress-server-1979784336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1979784336',id=116,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFjUHiasph3mANdjXDIFU/4z6QnY3zqHFX60ljMxnOboMARrmtehJoNKI61Z4yVjzWcQubwJZkj5r7viLLQ3CASAyZSRfJmCkosrre9zWh2jX66uWt7aGdm69U4zKqj5nQ==',key_name='tempest-TestGettingAddress-345322674',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-w9tyxpc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:05Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=b50dd877-42b1-46b2-933e-ee9a660a56c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.487 186853 DEBUG nova.network.os_vif_util [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.488 186853 DEBUG nova.network.os_vif_util [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.488 186853 DEBUG os_vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.490 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31719a20-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.491 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.493 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d472b77e-344c-46e5-a9ce-7265cea9e1d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568748, 'reachable_time': 35220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232323, 'error': None, 'target': 'ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.494 186853 INFO os_vif [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:ae:80,bridge_name='br-int',has_traffic_filtering=True,id=31719a20-f6e8-45a0-9f9a-d1e76c49b1a9,network=Network(6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31719a20-f6')#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.495 186853 INFO nova.virt.libvirt.driver [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Deleting instance files /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3_del#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.496 186853 INFO nova.virt.libvirt.driver [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Deletion of /var/lib/nova/instances/b50dd877-42b1-46b2-933e-ee9a660a56c3_del complete#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.496 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8591a8a4-c35f-454b-ba4c-4ec37a8765b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:39 np0005531887 systemd[1]: run-netns-ovnmeta\x2d8591a8a4\x2dc35f\x2d454b\x2dba4c\x2d4ec37a8765b2.mount: Deactivated successfully.
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.496 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[89436b7f-4f45-48d6-9e25-d6b7e05a4362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.497 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 in datapath 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad unbound from our chassis#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.499 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.500 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6228764a-ac07-4774-9bc3-7edd508b0021]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.500 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad namespace which is not needed anymore#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.576 186853 INFO nova.compute.manager [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.576 186853 DEBUG oslo.service.loopingcall [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.577 186853 DEBUG nova.compute.manager [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.577 186853 DEBUG nova.network.neutron [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [NOTICE]   (231254) : haproxy version is 2.8.14-c23fe91
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [NOTICE]   (231254) : path to executable is /usr/sbin/haproxy
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [WARNING]  (231254) : Exiting Master process...
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [WARNING]  (231254) : Exiting Master process...
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [ALERT]    (231254) : Current worker (231256) exited with code 143 (Terminated)
Nov 22 03:10:39 np0005531887 neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad[231250]: [WARNING]  (231254) : All workers exited. Exiting... (0)
Nov 22 03:10:39 np0005531887 systemd[1]: libpod-87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167.scope: Deactivated successfully.
Nov 22 03:10:39 np0005531887 podman[232342]: 2025-11-22 08:10:39.670186407 +0000 UTC m=+0.077569886 container died 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 03:10:39 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167-userdata-shm.mount: Deactivated successfully.
Nov 22 03:10:39 np0005531887 systemd[1]: var-lib-containers-storage-overlay-cd11cfd91e13c5cf9abb1af6cfd96061eaa6fe133724788e30e7bd2c5bd3b98a-merged.mount: Deactivated successfully.
Nov 22 03:10:39 np0005531887 podman[232342]: 2025-11-22 08:10:39.713156898 +0000 UTC m=+0.120540367 container cleanup 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:10:39 np0005531887 systemd[1]: libpod-conmon-87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167.scope: Deactivated successfully.
Nov 22 03:10:39 np0005531887 podman[232373]: 2025-11-22 08:10:39.817404961 +0000 UTC m=+0.080997531 container remove 87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.823 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[04202bb2-59d2-4349-b2aa-71026d49dfc0]: (4, ('Sat Nov 22 08:10:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad (87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167)\n87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167\nSat Nov 22 08:10:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad (87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167)\n87f18f933149c3181baec850bf50fb4b1c04123c4ccb43ac9e736ba1fcc25167\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.825 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2930dd08-6a07-46dc-b608-c22f04a146b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.826 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a8e7fc1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 kernel: tap6a8e7fc1-60: left promiscuous mode
Nov 22 03:10:39 np0005531887 nova_compute[186849]: 2025-11-22 08:10:39.844 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.846 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc405de-76ac-47ab-b2dd-4490c677723b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.867 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b938940a-bfe0-4296-8bb3-45cee8503442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.869 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[017e7c4b-d4fd-4bc2-b2f5-910ded6e98c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.890 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[efbea6a6-c694-4701-b561-cdd29400c5ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568843, 'reachable_time': 35101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232388, 'error': None, 'target': 'ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.892 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:10:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:10:39.892 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[00a9868a-c5b6-4b68-acc7-a45a6851d59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:10:40 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6a8e7fc1\x2d6ea3\x2d4bc9\x2d85d9\x2df62acc4ca9ad.mount: Deactivated successfully.
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.169 186853 DEBUG nova.compute.manager [req-b4594cdd-8ab8-4a61-9f7d-636f39ebfa5d req-86514022-fa23-4878-a64e-543b422279f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-deleted-1b5f134e-5728-4e7f-ba86-8650cc0b721d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.170 186853 INFO nova.compute.manager [req-b4594cdd-8ab8-4a61-9f7d-636f39ebfa5d req-86514022-fa23-4878-a64e-543b422279f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Neutron deleted interface 1b5f134e-5728-4e7f-ba86-8650cc0b721d; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.171 186853 DEBUG nova.network.neutron [req-b4594cdd-8ab8-4a61-9f7d-636f39ebfa5d req-86514022-fa23-4878-a64e-543b422279f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.191 186853 DEBUG nova.compute.manager [req-b4594cdd-8ab8-4a61-9f7d-636f39ebfa5d req-86514022-fa23-4878-a64e-543b422279f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Detach interface failed, port_id=1b5f134e-5728-4e7f-ba86-8650cc0b721d, reason: Instance b50dd877-42b1-46b2-933e-ee9a660a56c3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.283 186853 DEBUG nova.network.neutron [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updated VIF entry in instance network info cache for port 1b5f134e-5728-4e7f-ba86-8650cc0b721d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.284 186853 DEBUG nova.network.neutron [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [{"id": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "address": "fa:16:3e:9f:bd:73", "network": {"id": "8591a8a4-c35f-454b-ba4c-4ec37a8765b2", "bridge": "br-int", "label": "tempest-network-smoke--372886439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5f134e-57", "ovs_interfaceid": "1b5f134e-5728-4e7f-ba86-8650cc0b721d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "address": "fa:16:3e:37:ae:80", "network": {"id": "6a8e7fc1-6ea3-4bc9-85d9-f62acc4ca9ad", "bridge": "br-int", "label": "tempest-network-smoke--1013716252", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe37:ae80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31719a20-f6", "ovs_interfaceid": "31719a20-f6e8-45a0-9f9a-d1e76c49b1a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.299 186853 DEBUG oslo_concurrency.lockutils [req-d588c262-0ed3-4175-a933-3c2e879f0006 req-d1d80fcc-b12b-415c-8bc7-27ad142211bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b50dd877-42b1-46b2-933e-ee9a660a56c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.356 186853 DEBUG nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-unplugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.356 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.357 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.357 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.357 186853 DEBUG nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] No waiting events found dispatching network-vif-unplugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.357 186853 DEBUG nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-unplugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.358 186853 DEBUG nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.358 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.358 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.359 186853 DEBUG oslo_concurrency.lockutils [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.359 186853 DEBUG nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] No waiting events found dispatching network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.360 186853 WARNING nova.compute.manager [req-f297f156-99a4-463c-bf7a-d1092d9e2d6a req-189f7525-2b04-42e5-930e-2a140dbc9d32 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received unexpected event network-vif-plugged-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:10:41 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:10:41 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.622 186853 DEBUG nova.network.neutron [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.641 186853 INFO nova.compute.manager [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Took 2.06 seconds to deallocate network for instance.#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.722 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.722 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.779 186853 DEBUG nova.compute.provider_tree [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.791 186853 DEBUG nova.scheduler.client.report [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.816 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.840 186853 INFO nova.scheduler.client.report [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance b50dd877-42b1-46b2-933e-ee9a660a56c3#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.928 186853 DEBUG oslo_concurrency.lockutils [None req-646d9fcf-3987-47f4-a0e0-e4b7f4ab53d5 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "b50dd877-42b1-46b2-933e-ee9a660a56c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:41 np0005531887 nova_compute[186849]: 2025-11-22 08:10:41.942 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:42 np0005531887 nova_compute[186849]: 2025-11-22 08:10:42.973 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531887 nova_compute[186849]: 2025-11-22 08:10:43.184 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:43 np0005531887 nova_compute[186849]: 2025-11-22 08:10:43.337 186853 DEBUG nova.compute.manager [req-3aff8a88-d63a-4271-9371-1eca53878ded req-593e6ff5-950c-45cb-b3af-daed5d993a3b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Received event network-vif-deleted-31719a20-f6e8-45a0-9f9a-d1e76c49b1a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:10:44 np0005531887 nova_compute[186849]: 2025-11-22 08:10:44.494 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:44 np0005531887 podman[232391]: 2025-11-22 08:10:44.833499831 +0000 UTC m=+0.056201348 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:10:46 np0005531887 nova_compute[186849]: 2025-11-22 08:10:46.945 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:48 np0005531887 podman[232410]: 2025-11-22 08:10:48.878671511 +0000 UTC m=+0.094352341 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:10:49 np0005531887 nova_compute[186849]: 2025-11-22 08:10:49.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:51 np0005531887 nova_compute[186849]: 2025-11-22 08:10:51.947 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:53 np0005531887 podman[232431]: 2025-11-22 08:10:53.837320574 +0000 UTC m=+0.056598947 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:10:54 np0005531887 nova_compute[186849]: 2025-11-22 08:10:54.414 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799039.4130485, b50dd877-42b1-46b2-933e-ee9a660a56c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:10:54 np0005531887 nova_compute[186849]: 2025-11-22 08:10:54.415 186853 INFO nova.compute.manager [-] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:10:54 np0005531887 nova_compute[186849]: 2025-11-22 08:10:54.445 186853 DEBUG nova.compute.manager [None req-e3d1aa45-a436-46b4-a706-5b933c4f2e80 - - - - - -] [instance: b50dd877-42b1-46b2-933e-ee9a660a56c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:10:54 np0005531887 nova_compute[186849]: 2025-11-22 08:10:54.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:55 np0005531887 nova_compute[186849]: 2025-11-22 08:10:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:56 np0005531887 nova_compute[186849]: 2025-11-22 08:10:56.949 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.797 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.962 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.963 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=73.33177185058594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.963 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:10:58 np0005531887 nova_compute[186849]: 2025-11-22 08:10:58.964 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.010 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.011 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.027 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.042 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.042 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.060 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.079 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.106 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.120 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.141 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.141 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:10:59 np0005531887 nova_compute[186849]: 2025-11-22 08:10:59.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:00 np0005531887 nova_compute[186849]: 2025-11-22 08:11:00.135 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:00 np0005531887 nova_compute[186849]: 2025-11-22 08:11:00.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:00 np0005531887 nova_compute[186849]: 2025-11-22 08:11:00.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:11:00 np0005531887 podman[232457]: 2025-11-22 08:11:00.87934014 +0000 UTC m=+0.085871680 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal 
rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Nov 22 03:11:01 np0005531887 nova_compute[186849]: 2025-11-22 08:11:01.951 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:02 np0005531887 nova_compute[186849]: 2025-11-22 08:11:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:02 np0005531887 nova_compute[186849]: 2025-11-22 08:11:02.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:11:02 np0005531887 nova_compute[186849]: 2025-11-22 08:11:02.793 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:11:02 np0005531887 nova_compute[186849]: 2025-11-22 08:11:02.794 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:03 np0005531887 nova_compute[186849]: 2025-11-22 08:11:03.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:03 np0005531887 podman[232479]: 2025-11-22 08:11:03.844443764 +0000 UTC m=+0.061877848 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:11:03 np0005531887 podman[232480]: 2025-11-22 08:11:03.87138086 +0000 UTC m=+0.087080191 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:11:04 np0005531887 nova_compute[186849]: 2025-11-22 08:11:04.505 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:06 np0005531887 nova_compute[186849]: 2025-11-22 08:11:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:06 np0005531887 nova_compute[186849]: 2025-11-22 08:11:06.953 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:08 np0005531887 podman[232524]: 2025-11-22 08:11:08.82012845 +0000 UTC m=+0.044327385 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:11:09 np0005531887 nova_compute[186849]: 2025-11-22 08:11:09.508 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:10 np0005531887 nova_compute[186849]: 2025-11-22 08:11:10.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:11 np0005531887 nova_compute[186849]: 2025-11-22 08:11:11.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:14 np0005531887 nova_compute[186849]: 2025-11-22 08:11:14.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:15.625 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8:0:1:f816:3eff:fe30:2c69 2001:db8::f816:3eff:fe30:2c69'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe30:2c69/64 2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f86e6fc7-3969-4922-9612-9c86d85f21ec) old=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8::f816:3eff:fe30:2c69'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:15.626 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f86e6fc7-3969-4922-9612-9c86d85f21ec in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 updated#033[00m
Nov 22 03:11:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:15.628 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:11:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:15.629 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[92b58225-4283-4207-a21b-ec15bdaa3c75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:15 np0005531887 podman[232548]: 2025-11-22 08:11:15.855521932 +0000 UTC m=+0.078660212 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:11:16 np0005531887 nova_compute[186849]: 2025-11-22 08:11:16.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:17.826 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:17.826 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:11:17 np0005531887 nova_compute[186849]: 2025-11-22 08:11:17.828 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531887 nova_compute[186849]: 2025-11-22 08:11:19.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:19 np0005531887 podman[232568]: 2025-11-22 08:11:19.860488678 +0000 UTC m=+0.084312901 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:11:21 np0005531887 nova_compute[186849]: 2025-11-22 08:11:21.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:22 np0005531887 nova_compute[186849]: 2025-11-22 08:11:22.695 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:22 np0005531887 nova_compute[186849]: 2025-11-22 08:11:22.695 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:22 np0005531887 nova_compute[186849]: 2025-11-22 08:11:22.721 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:11:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:22.828 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.007 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.007 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.014 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.014 186853 INFO nova.compute.claims [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.127 186853 DEBUG nova.compute.provider_tree [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.139 186853 DEBUG nova.scheduler.client.report [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.169 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.170 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.253 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.253 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.290 186853 INFO nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.326 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.476 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.478 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.479 186853 INFO nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Creating image(s)#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.479 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.480 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.480 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.493 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.544 186853 DEBUG nova.policy [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.552 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.552 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.553 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.565 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.624 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.626 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.662 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.663 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.664 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.727 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.728 186853 DEBUG nova.virt.disk.api [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.728 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.785 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.786 186853 DEBUG nova.virt.disk.api [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.786 186853 DEBUG nova.objects.instance [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid c48669e2-d72e-4b32-9bfa-ebda39e1376c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.811 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.811 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Ensure instance console log exists: /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.812 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.812 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:23 np0005531887 nova_compute[186849]: 2025-11-22 08:11:23.813 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:24 np0005531887 nova_compute[186849]: 2025-11-22 08:11:24.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:24 np0005531887 nova_compute[186849]: 2025-11-22 08:11:24.591 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Successfully created port: 136f6afa-dc75-4024-af1d-a03b4dae22a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:11:24 np0005531887 podman[232603]: 2025-11-22 08:11:24.843305687 +0000 UTC m=+0.060895324 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:11:25 np0005531887 nova_compute[186849]: 2025-11-22 08:11:25.366 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Successfully created port: 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.690 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Successfully updated port: 136f6afa-dc75-4024-af1d-a03b4dae22a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.785 186853 DEBUG nova.compute.manager [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.786 186853 DEBUG nova.compute.manager [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing instance network info cache due to event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.786 186853 DEBUG oslo_concurrency.lockutils [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.786 186853 DEBUG oslo_concurrency.lockutils [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.786 186853 DEBUG nova.network.neutron [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing network info cache for port 136f6afa-dc75-4024-af1d-a03b4dae22a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:11:26 np0005531887 nova_compute[186849]: 2025-11-22 08:11:26.962 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.135 186853 DEBUG nova.network.neutron [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.703 186853 DEBUG nova.network.neutron [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.720 186853 DEBUG oslo_concurrency.lockutils [req-d78b7e84-df8e-497c-a991-6810e765e85c req-a29bace2-3e7f-4729-afc0-3ccb7f7d9b02 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.851 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Successfully updated port: 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.902 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.903 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:27 np0005531887 nova_compute[186849]: 2025-11-22 08:11:27.903 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:11:28 np0005531887 nova_compute[186849]: 2025-11-22 08:11:28.141 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:11:28 np0005531887 nova_compute[186849]: 2025-11-22 08:11:28.924 186853 DEBUG nova.compute.manager [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-changed-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:28 np0005531887 nova_compute[186849]: 2025-11-22 08:11:28.925 186853 DEBUG nova.compute.manager [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing instance network info cache due to event network-changed-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:11:28 np0005531887 nova_compute[186849]: 2025-11-22 08:11:28.925 186853 DEBUG oslo_concurrency.lockutils [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:29 np0005531887 nova_compute[186849]: 2025-11-22 08:11:29.520 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.603 186853 DEBUG nova.network.neutron [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.667 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.668 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance network_info: |[{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.669 186853 DEBUG oslo_concurrency.lockutils [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.669 186853 DEBUG nova.network.neutron [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing network info cache for port 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.673 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Start _get_guest_xml network_info=[{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.677 186853 WARNING nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.684 186853 DEBUG nova.virt.libvirt.host [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.685 186853 DEBUG nova.virt.libvirt.host [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.690 186853 DEBUG nova.virt.libvirt.host [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.691 186853 DEBUG nova.virt.libvirt.host [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.692 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.692 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.693 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.693 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.694 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.694 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.694 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.694 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.695 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.695 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.695 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.695 186853 DEBUG nova.virt.hardware [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.700 186853 DEBUG nova.virt.libvirt.vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:23Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.700 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.701 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.702 186853 DEBUG nova.virt.libvirt.vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:23Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.702 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.703 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.704 186853 DEBUG nova.objects.instance [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid c48669e2-d72e-4b32-9bfa-ebda39e1376c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.717 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <uuid>c48669e2-d72e-4b32-9bfa-ebda39e1376c</uuid>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <name>instance-00000083</name>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-1879914593</nova:name>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:11:31</nova:creationTime>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:port uuid="136f6afa-dc75-4024-af1d-a03b4dae22a5">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        <nova:port uuid="3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe80:9cac" ipVersion="6"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe80:9cac" ipVersion="6"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="serial">c48669e2-d72e-4b32-9bfa-ebda39e1376c</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="uuid">c48669e2-d72e-4b32-9bfa-ebda39e1376c</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.config"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ca:68:4f"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <target dev="tap136f6afa-dc"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:80:9c:ac"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <target dev="tap3869bf72-6a"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/console.log" append="off"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:11:31 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:11:31 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:11:31 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:11:31 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.719 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Preparing to wait for external event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.719 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.719 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.720 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.720 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Preparing to wait for external event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.720 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.720 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.721 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.722 186853 DEBUG nova.virt.libvirt.vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:23Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.722 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.723 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.723 186853 DEBUG os_vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.724 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.724 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.725 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.727 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.728 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap136f6afa-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.728 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap136f6afa-dc, col_values=(('external_ids', {'iface-id': '136f6afa-dc75-4024-af1d-a03b4dae22a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:68:4f', 'vm-uuid': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.730 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 NetworkManager[55210]: <info>  [1763799091.7310] manager: (tap136f6afa-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.734 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.736 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.737 186853 INFO os_vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc')#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.738 186853 DEBUG nova.virt.libvirt.vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:23Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.738 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.739 186853 DEBUG nova.network.os_vif_util [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.739 186853 DEBUG os_vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.740 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.740 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.740 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.742 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.742 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3869bf72-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.742 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3869bf72-6a, col_values=(('external_ids', {'iface-id': '3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:9c:ac', 'vm-uuid': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:31 np0005531887 NetworkManager[55210]: <info>  [1763799091.7446] manager: (tap3869bf72-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.743 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.752 186853 INFO os_vif [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a')#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.811 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.812 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.812 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:ca:68:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.812 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:80:9c:ac, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.813 186853 INFO nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Using config drive#033[00m
Nov 22 03:11:31 np0005531887 podman[232631]: 2025-11-22 08:11:31.848551316 +0000 UTC m=+0.062386670 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 22 03:11:31 np0005531887 nova_compute[186849]: 2025-11-22 08:11:31.964 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:33 np0005531887 nova_compute[186849]: 2025-11-22 08:11:33.882 186853 INFO nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Creating config drive at /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.config#033[00m
Nov 22 03:11:33 np0005531887 nova_compute[186849]: 2025-11-22 08:11:33.887 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgt777h5f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.013 186853 DEBUG oslo_concurrency.processutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgt777h5f" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1005] manager: (tap136f6afa-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Nov 22 03:11:34 np0005531887 kernel: tap136f6afa-dc: entered promiscuous mode
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00416|binding|INFO|Claiming lport 136f6afa-dc75-4024-af1d-a03b4dae22a5 for this chassis.
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00417|binding|INFO|136f6afa-dc75-4024-af1d-a03b4dae22a5: Claiming fa:16:3e:ca:68:4f 10.100.0.14
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.117 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 kernel: tap3869bf72-6a: entered promiscuous mode
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1277] manager: (tap3869bf72-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Nov 22 03:11:34 np0005531887 systemd-udevd[232700]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:11:34 np0005531887 systemd-udevd[232698]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1644] device (tap3869bf72-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1656] device (tap136f6afa-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1665] device (tap3869bf72-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.1670] device (tap136f6afa-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:11:34 np0005531887 systemd-machined[153180]: New machine qemu-48-instance-00000083.
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00418|if_status|INFO|Not updating pb chassis for 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 now as sb is readonly
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.198 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.201 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 podman[232662]: 2025-11-22 08:11:34.203993126 +0000 UTC m=+0.110839877 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:11:34 np0005531887 systemd[1]: Started Virtual Machine qemu-48-instance-00000083.
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.209 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00419|binding|INFO|Setting lport 136f6afa-dc75-4024-af1d-a03b4dae22a5 ovn-installed in OVS
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 podman[232663]: 2025-11-22 08:11:34.226292146 +0000 UTC m=+0.131779983 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00420|binding|INFO|Claiming lport 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 for this chassis.
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00421|binding|INFO|3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1: Claiming fa:16:3e:80:9c:ac 2001:db8:0:1:f816:3eff:fe80:9cac 2001:db8::f816:3eff:fe80:9cac
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00422|binding|INFO|Setting lport 136f6afa-dc75-4024-af1d-a03b4dae22a5 up in Southbound
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00423|binding|INFO|Setting lport 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 ovn-installed in OVS
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.236 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:68:4f 10.100.0.14'], port_security=['fa:16:3e:ca:68:4f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04f3fbae-1178-425a-a955-30dcd392a3d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79e2fe83-1ab0-49c1-acb4-3bc86f0137dc, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=136f6afa-dc75-4024-af1d-a03b4dae22a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.237 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 136f6afa-dc75-4024-af1d-a03b4dae22a5 in datapath 04f3fbae-1178-425a-a955-30dcd392a3d3 bound to our chassis#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.237 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.238 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04f3fbae-1178-425a-a955-30dcd392a3d3#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.252 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[52adb188-1a5d-4689-93ee-dec3d073aac6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.253 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04f3fbae-11 in ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.255 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04f3fbae-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.256 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e7409c1f-2787-4dac-9c1a-21f2fc083489]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.256 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[daa80ad5-715a-4d38-9ea9-70ebe5889720]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.269 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc40e5f-e814-4cc7-9adc-c7c05438ca04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.294 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0d7d84d2-03e0-484e-a6b3-e4d3a7a95542]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00424|binding|INFO|Setting lport 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 up in Southbound
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.320 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:9c:ac 2001:db8:0:1:f816:3eff:fe80:9cac 2001:db8::f816:3eff:fe80:9cac'], port_security=['fa:16:3e:80:9c:ac 2001:db8:0:1:f816:3eff:fe80:9cac 2001:db8::f816:3eff:fe80:9cac'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe80:9cac/64 2001:db8::f816:3eff:fe80:9cac/64', 'neutron:device_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.326 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8998db91-2e49-4617-93d0-0bd4e84c1d4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.3329] manager: (tap04f3fbae-10): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.337 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[682ce9c0-9591-4518-8891-7896b1bdc7e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.371 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[58a2cbf3-0187-461b-949c-7ccc017ae83f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.376 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1dff2ea0-c7e0-42b6-a468-91d5836b836c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.4024] device (tap04f3fbae-10): carrier: link connected
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.407 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e22341b8-5de0-4635-a32e-c93cfb173cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.425 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a2dc4111-76f5-4262-8ede-27dd824dab34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04f3fbae-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:20:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583700, 'reachable_time': 17436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232754, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.445 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c22ab498-ee1a-46a2-9dc3-8cc917a5897c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:2089'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583700, 'tstamp': 583700}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232755, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.465 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[55d730d8-002b-4746-b5cc-4009182d9b18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04f3fbae-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:20:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583700, 'reachable_time': 17436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232756, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.498 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[63a725ee-0417-429d-842e-9365b4a00c84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.558 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[258e1757-8acb-4368-aeed-45de8e0ad872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.559 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04f3fbae-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.560 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.560 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04f3fbae-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:34 np0005531887 NetworkManager[55210]: <info>  [1763799094.5625] manager: (tap04f3fbae-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 kernel: tap04f3fbae-10: entered promiscuous mode
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.567 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04f3fbae-10, col_values=(('external_ids', {'iface-id': '725c746c-ac46-482e-8d13-14e88613ed55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:34 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:34Z|00425|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.570 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.569 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.583 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.581 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[55922c84-297b-4cd6-9006-12ec1f183839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.584 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-04f3fbae-1178-425a-a955-30dcd392a3d3
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/04f3fbae-1178-425a-a955-30dcd392a3d3.pid.haproxy
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 04f3fbae-1178-425a-a955-30dcd392a3d3
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:11:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:34.585 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'env', 'PROCESS_TAG=haproxy-04f3fbae-1178-425a-a955-30dcd392a3d3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04f3fbae-1178-425a-a955-30dcd392a3d3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.915 186853 DEBUG nova.compute.manager [req-a8ecd5b5-3094-4d39-88d9-ec9e73ddb8b0 req-e9baa3db-a74b-4668-8574-e1f0f8e111b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.916 186853 DEBUG oslo_concurrency.lockutils [req-a8ecd5b5-3094-4d39-88d9-ec9e73ddb8b0 req-e9baa3db-a74b-4668-8574-e1f0f8e111b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.916 186853 DEBUG oslo_concurrency.lockutils [req-a8ecd5b5-3094-4d39-88d9-ec9e73ddb8b0 req-e9baa3db-a74b-4668-8574-e1f0f8e111b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.916 186853 DEBUG oslo_concurrency.lockutils [req-a8ecd5b5-3094-4d39-88d9-ec9e73ddb8b0 req-e9baa3db-a74b-4668-8574-e1f0f8e111b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:34 np0005531887 nova_compute[186849]: 2025-11-22 08:11:34.917 186853 DEBUG nova.compute.manager [req-a8ecd5b5-3094-4d39-88d9-ec9e73ddb8b0 req-e9baa3db-a74b-4668-8574-e1f0f8e111b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Processing event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:11:34 np0005531887 podman[232788]: 2025-11-22 08:11:34.970705357 +0000 UTC m=+0.059133110 container create 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 03:11:35 np0005531887 systemd[1]: Started libpod-conmon-1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b.scope.
Nov 22 03:11:35 np0005531887 podman[232788]: 2025-11-22 08:11:34.934228077 +0000 UTC m=+0.022655860 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:11:35 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:11:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/145a19438fc4c00ac4ac3524022b7a99f4c1f89656fef9512c781abbbd6a9704/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:11:35 np0005531887 podman[232788]: 2025-11-22 08:11:35.077456621 +0000 UTC m=+0.165884394 container init 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:11:35 np0005531887 podman[232788]: 2025-11-22 08:11:35.085117601 +0000 UTC m=+0.173545354 container start 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:11:35 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [NOTICE]   (232808) : New worker (232810) forked
Nov 22 03:11:35 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [NOTICE]   (232808) : Loading success.
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.158 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 unbound from our chassis#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.162 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.175 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[abbcf167-973f-43e0-ba6a-3512bfc04dd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.176 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2b7e9f2d-21 in ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.178 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2b7e9f2d-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.178 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[413b8844-ba24-4725-8737-b3650fd1471b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.179 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ec816523-1920-4a28-83c0-1b3f737f01c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.192 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e3c276-11cb-4bf9-8ae5-deae53bc04bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.216 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d332d7ef-3495-49cf-a2ca-b7189e55a59d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.246 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6d656ae7-ef95-49bc-b7dd-5bd837949d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 systemd-udevd[232746]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.254 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d73507f8-e45f-49b0-b706-fabf8a2cbf35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 NetworkManager[55210]: <info>  [1763799095.2559] manager: (tap2b7e9f2d-20): new Veth device (/org/freedesktop/NetworkManager/Devices/195)
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.292 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0bee696b-2d7f-4931-98b3-6491c4054e70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.296 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[b43b85d4-b056-47e0-944e-ad2cd27879cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.299 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799095.2992048, c48669e2-d72e-4b32-9bfa-ebda39e1376c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.300 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] VM Started (Lifecycle Event)#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.319 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.323 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799095.3000925, c48669e2-d72e-4b32-9bfa-ebda39e1376c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.323 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:11:35 np0005531887 NetworkManager[55210]: <info>  [1763799095.3249] device (tap2b7e9f2d-20): carrier: link connected
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.331 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[da2405de-538e-4bf0-bf88-bc22b6a8f3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.345 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.348 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.350 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[725c5a5d-5568-4038-9366-96b1d4fd0cd6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b7e9f2d-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:2c:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583793, 'reachable_time': 43430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232837, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.368 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fc165ca3-6297-4ca1-91f6-d958b9c11b5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:2c69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583793, 'tstamp': 583793}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232838, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.370 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.387 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1eafcaf1-7b33-49a6-9f08-1dd3ffc630ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b7e9f2d-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:2c:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583793, 'reachable_time': 43430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232839, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.424 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff8ce90-d942-451f-9de4-8b28a83dc066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.457 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe0f9e5-8a0e-4b92-9399-aa53c0817072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.459 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7e9f2d-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.459 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.460 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b7e9f2d-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:35 np0005531887 kernel: tap2b7e9f2d-20: entered promiscuous mode
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.462 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:35 np0005531887 NetworkManager[55210]: <info>  [1763799095.4635] manager: (tap2b7e9f2d-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.465 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2b7e9f2d-20, col_values=(('external_ids', {'iface-id': 'f86e6fc7-3969-4922-9612-9c86d85f21ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.465 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:35Z|00426|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:11:35 np0005531887 nova_compute[186849]: 2025-11-22 08:11:35.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.479 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.480 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[575fa6f0-1e1c-4032-adff-2a477cdd67f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.481 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.pid.haproxy
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:11:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:35.482 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'env', 'PROCESS_TAG=haproxy-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:11:35 np0005531887 podman[232868]: 2025-11-22 08:11:35.852494498 +0000 UTC m=+0.056677009 container create d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:11:35 np0005531887 systemd[1]: Started libpod-conmon-d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5.scope.
Nov 22 03:11:35 np0005531887 podman[232868]: 2025-11-22 08:11:35.82177336 +0000 UTC m=+0.025955891 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:11:35 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:11:35 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a27d9b528248bd6566a9c65b35d4d9c8384942bbdaf0aae60b44541565327c00/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:11:35 np0005531887 podman[232868]: 2025-11-22 08:11:35.936286616 +0000 UTC m=+0.140469157 container init d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:11:35 np0005531887 podman[232868]: 2025-11-22 08:11:35.941874434 +0000 UTC m=+0.146056945 container start d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:11:35 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [NOTICE]   (232889) : New worker (232891) forked
Nov 22 03:11:35 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [NOTICE]   (232889) : Loading success.
Nov 22 03:11:36 np0005531887 nova_compute[186849]: 2025-11-22 08:11:36.626 186853 DEBUG nova.network.neutron [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updated VIF entry in instance network info cache for port 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:11:36 np0005531887 nova_compute[186849]: 2025-11-22 08:11:36.627 186853 DEBUG nova.network.neutron [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:36 np0005531887 nova_compute[186849]: 2025-11-22 08:11:36.644 186853 DEBUG oslo_concurrency.lockutils [req-7b5d4d41-d81a-431d-8059-60b76e701dac req-54ab3fb2-08a5-477a-a7a2-65c19a70a363 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:36 np0005531887 nova_compute[186849]: 2025-11-22 08:11:36.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:36 np0005531887 nova_compute[186849]: 2025-11-22 08:11:36.966 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.094 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.095 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.095 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.095 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.095 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No event matching network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 in dict_keys([('network-vif-plugged', '3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.096 186853 WARNING nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.096 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.096 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.097 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.097 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.097 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Processing event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.097 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.098 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.098 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.098 186853 DEBUG oslo_concurrency.lockutils [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.098 186853 DEBUG nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No waiting events found dispatching network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.098 186853 WARNING nova.compute.manager [req-16b999c7-c6ef-45e9-bf1d-89213a4a26a5 req-5cf35794-e3cb-4ff5-9ef3-3ed38002befe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 for instance with vm_state building and task_state spawning.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.099 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.103 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799097.1036847, c48669e2-d72e-4b32-9bfa-ebda39e1376c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.104 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.106 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.110 186853 INFO nova.virt.libvirt.driver [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance spawned successfully.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.110 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.124 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.128 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.129 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.129 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.130 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.130 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.130 186853 DEBUG nova.virt.libvirt.driver [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.134 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.158 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.201 186853 INFO nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Took 13.72 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.202 186853 DEBUG nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.273 186853 INFO nova.compute.manager [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Took 14.31 seconds to build instance.#033[00m
Nov 22 03:11:37 np0005531887 nova_compute[186849]: 2025-11-22 08:11:37.297 186853 DEBUG oslo_concurrency.lockutils [None req-44d31ddd-e614-4c84-823d-1475b8405f4a 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:37.346 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:37.347 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:11:37.348 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:11:39 np0005531887 podman[232900]: 2025-11-22 08:11:39.845510701 +0000 UTC m=+0.060768832 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:11:40 np0005531887 nova_compute[186849]: 2025-11-22 08:11:40.998 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:41 np0005531887 NetworkManager[55210]: <info>  [1763799100.9993] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Nov 22 03:11:41 np0005531887 NetworkManager[55210]: <info>  [1763799101.0008] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.106 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:41Z|00427|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:11:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:41Z|00428|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.125 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.748 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.951 186853 DEBUG nova.compute.manager [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.951 186853 DEBUG nova.compute.manager [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing instance network info cache due to event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.952 186853 DEBUG oslo_concurrency.lockutils [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.952 186853 DEBUG oslo_concurrency.lockutils [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.952 186853 DEBUG nova.network.neutron [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing network info cache for port 136f6afa-dc75-4024-af1d-a03b4dae22a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:11:41 np0005531887 nova_compute[186849]: 2025-11-22 08:11:41.973 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:42 np0005531887 nova_compute[186849]: 2025-11-22 08:11:42.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:43 np0005531887 nova_compute[186849]: 2025-11-22 08:11:43.590 186853 DEBUG nova.network.neutron [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updated VIF entry in instance network info cache for port 136f6afa-dc75-4024-af1d-a03b4dae22a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:11:43 np0005531887 nova_compute[186849]: 2025-11-22 08:11:43.591 186853 DEBUG nova.network.neutron [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:11:43 np0005531887 nova_compute[186849]: 2025-11-22 08:11:43.667 186853 DEBUG oslo_concurrency.lockutils [req-1ae6c680-ec32-45bb-ad86-6248dfcc05d4 req-7d676e33-b76a-455a-8131-daf12b3ea6a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:11:46 np0005531887 nova_compute[186849]: 2025-11-22 08:11:46.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:46 np0005531887 podman[232923]: 2025-11-22 08:11:46.845001328 +0000 UTC m=+0.065075126 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:11:46 np0005531887 nova_compute[186849]: 2025-11-22 08:11:46.875 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:46 np0005531887 nova_compute[186849]: 2025-11-22 08:11:46.973 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:50 np0005531887 podman[232963]: 2025-11-22 08:11:50.854493077 +0000 UTC m=+0.067376243 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251118)
Nov 22 03:11:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:51Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ca:68:4f 10.100.0.14
Nov 22 03:11:51 np0005531887 ovn_controller[95130]: 2025-11-22T08:11:51Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:68:4f 10.100.0.14
Nov 22 03:11:51 np0005531887 nova_compute[186849]: 2025-11-22 08:11:51.754 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:51 np0005531887 nova_compute[186849]: 2025-11-22 08:11:51.976 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:55 np0005531887 nova_compute[186849]: 2025-11-22 08:11:55.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:11:55 np0005531887 podman[232986]: 2025-11-22 08:11:55.846238476 +0000 UTC m=+0.053601083 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:11:56 np0005531887 nova_compute[186849]: 2025-11-22 08:11:56.758 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:11:56 np0005531887 nova_compute[186849]: 2025-11-22 08:11:56.977 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.812 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.813 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.813 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.813 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.898 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.963 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:00 np0005531887 nova_compute[186849]: 2025-11-22 08:12:00.964 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.029 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.226 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.228 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5556MB free_disk=73.30210494995117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.228 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.229 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.296 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c48669e2-d72e-4b32-9bfa-ebda39e1376c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.297 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.297 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.354 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.379 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.396 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.396 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:01 np0005531887 nova_compute[186849]: 2025-11-22 08:12:01.978 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.389 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.390 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.390 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:12:02 np0005531887 nova_compute[186849]: 2025-11-22 08:12:02.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:12:02 np0005531887 podman[233019]: 2025-11-22 08:12:02.841173741 +0000 UTC m=+0.060253438 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:12:03 np0005531887 nova_compute[186849]: 2025-11-22 08:12:03.001 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:12:03 np0005531887 nova_compute[186849]: 2025-11-22 08:12:03.001 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:12:03 np0005531887 nova_compute[186849]: 2025-11-22 08:12:03.001 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:12:03 np0005531887 nova_compute[186849]: 2025-11-22 08:12:03.002 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c48669e2-d72e-4b32-9bfa-ebda39e1376c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:12:04 np0005531887 podman[233039]: 2025-11-22 08:12:04.849823112 +0000 UTC m=+0.061282403 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 03:12:04 np0005531887 podman[233040]: 2025-11-22 08:12:04.880577181 +0000 UTC m=+0.086948347 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:12:05 np0005531887 nova_compute[186849]: 2025-11-22 08:12:05.572 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:12:05 np0005531887 nova_compute[186849]: 2025-11-22 08:12:05.871 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:12:05 np0005531887 nova_compute[186849]: 2025-11-22 08:12:05.872 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:12:05 np0005531887 nova_compute[186849]: 2025-11-22 08:12:05.872 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:05 np0005531887 nova_compute[186849]: 2025-11-22 08:12:05.873 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:06 np0005531887 nova_compute[186849]: 2025-11-22 08:12:06.764 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:06 np0005531887 nova_compute[186849]: 2025-11-22 08:12:06.981 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:07Z|00429|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:07Z|00430|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:07 np0005531887 nova_compute[186849]: 2025-11-22 08:12:07.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:07 np0005531887 nova_compute[186849]: 2025-11-22 08:12:07.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:10 np0005531887 podman[233085]: 2025-11-22 08:12:10.861484142 +0000 UTC m=+0.083640416 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:12:11 np0005531887 nova_compute[186849]: 2025-11-22 08:12:11.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:11 np0005531887 nova_compute[186849]: 2025-11-22 08:12:11.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:11 np0005531887 nova_compute[186849]: 2025-11-22 08:12:11.985 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:15 np0005531887 nova_compute[186849]: 2025-11-22 08:12:15.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:16 np0005531887 nova_compute[186849]: 2025-11-22 08:12:16.771 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:16 np0005531887 nova_compute[186849]: 2025-11-22 08:12:16.985 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:17 np0005531887 podman[233109]: 2025-11-22 08:12:17.828378125 +0000 UTC m=+0.052360163 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:12:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:18.798 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:12:18 np0005531887 nova_compute[186849]: 2025-11-22 08:12:18.799 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:18.800 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:12:19 np0005531887 nova_compute[186849]: 2025-11-22 08:12:19.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:21 np0005531887 nova_compute[186849]: 2025-11-22 08:12:21.773 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:21 np0005531887 podman[233129]: 2025-11-22 08:12:21.837083025 +0000 UTC m=+0.060690449 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 03:12:21 np0005531887 nova_compute[186849]: 2025-11-22 08:12:21.987 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:23 np0005531887 nova_compute[186849]: 2025-11-22 08:12:23.733 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:26 np0005531887 nova_compute[186849]: 2025-11-22 08:12:26.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:26 np0005531887 podman[233149]: 2025-11-22 08:12:26.82418537 +0000 UTC m=+0.048383436 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:12:26 np0005531887 nova_compute[186849]: 2025-11-22 08:12:26.990 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:27.802 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:12:31 np0005531887 nova_compute[186849]: 2025-11-22 08:12:31.778 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:31 np0005531887 nova_compute[186849]: 2025-11-22 08:12:31.991 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:33 np0005531887 podman[233173]: 2025-11-22 08:12:33.838304808 +0000 UTC m=+0.057139711 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=)
Nov 22 03:12:35 np0005531887 podman[233194]: 2025-11-22 08:12:35.843125594 +0000 UTC m=+0.060440622 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:12:35 np0005531887 podman[233195]: 2025-11-22 08:12:35.879444351 +0000 UTC m=+0.090847373 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'name': 'tempest-TestGettingAddress-server-1879914593', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000083', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.698 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.699 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '651cd255-5121-440d-a984-ec26378d4aac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.672287', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01681498-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '3b14c24eb9da40593b15b1689aa4dc76029d0e3101266d4f318a8eb4ad14396c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.672287', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0168223a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': 'c185bed38ca8b78725d264c8b4cb61b55a9fc739094aab41c9a830f948addfba'}]}, 'timestamp': '2025-11-22 08:12:36.699329', '_unique_id': '327cb01f4f524f47980a53049d278f82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.704 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c48669e2-d72e-4b32-9bfa-ebda39e1376c / tap136f6afa-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.704 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c48669e2-d72e-4b32-9bfa-ebda39e1376c / tap3869bf72-6a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.704 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.705 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.bytes volume: 2962 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05f7225d-311b-466a-9bc0-87af4a4c3d5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.701516', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '0169047a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '4b3a371edda6a384d785ea3ee1b73eb75ced2bca4a8c0537c0298c4f1ee66816'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2962, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.701516', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '01691140-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'b1aea10213129a054805bc7c1e7a34a3718be9de94047c3676186401137bc962'}]}, 'timestamp': '2025-11-22 08:12:36.705428', '_unique_id': 'df9d4065048848e5ad3440fdcd973838'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.717 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.718 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6880fd2e-f72f-4bb2-b6bc-8e378d5b3f98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.707397', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '016b0270-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': 'd657df61ee5c08cf97399f01bea18c52341dfc9807dfa7d30612808774c1e510'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.707397', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '016b18b4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': '20d5ffebb5178ee6bf4ee4536ad398173b21649d7287353c8df27177f8ea31cb'}]}, 'timestamp': '2025-11-22 08:12:36.718761', '_unique_id': '9f6b6699e120407daac88929cc6fbfd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.721 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.latency volume: 4982068282 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.721 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0792a0-77c5-43fd-91bb-e47f92f6da71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4982068282, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.721328', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '016b89b6-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '9a6fd0f142085a8079c3c37eaf8c09c2f93c0188b2ddf18dc08ddcb6774046f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.721328', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '016b9366-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '6429ed8ea617c8846c711f65de8b7cb54a5d43cf78067210fea9e798c288f123'}]}, 'timestamp': '2025-11-22 08:12:36.721852', '_unique_id': 'b1ce5ecf36834a5da573e3c9d6847887'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.723 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.723 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.latency volume: 791660244 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.723 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.latency volume: 66934709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9215b10-f959-4e8f-ba8a-3d892f1fface', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 791660244, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.723524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '016bdef2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': 'cac1226cbf3125b8a57edc1d99f7ffbdfc6dd017cb05c59ab3db7ce53fbe4b59'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 66934709, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.723524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '016be8f2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': 'd1a5ec7a2b01d69377b0946baf40e70d2c4ad49c22e3441a4cdc6a1d93dac2f6'}]}, 'timestamp': '2025-11-22 08:12:36.724051', '_unique_id': 'd3c0778e4c73480eb1774122decf44ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.725 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.725 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.725 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>]
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.725 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.742 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/memory.usage volume: 43.984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4bb283a-a5f6-4f9e-abc4-c1746ad01d74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.984375, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'timestamp': '2025-11-22T08:12:36.725953', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '016ecdb0-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.412152674, 'message_signature': '112c615c17fbca6f55ac1cb0bfd45f794227190a9a5d53086d7e1066b7f3e8db'}]}, 'timestamp': '2025-11-22 08:12:36.743079', '_unique_id': 'fb7f33bf40c64cdd8f694928a9dd1863'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.745 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.745 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac0ab0cb-d3c0-44e4-9ff3-de8582e58f70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.745109', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '016f2d46-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '5ccc8ad0c432a1e5a10531d481046145650578d7fa328c7961c9089e2d92cccd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.745109', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '016f396c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'ef4a4138b6249c89838d06c2d5916c87ac0248f19e44091620244f017e6a1bac'}]}, 'timestamp': '2025-11-22 08:12:36.745788', '_unique_id': 'eb90bd693db64e2aa82dfa6836eef9d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.747 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.747 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1fa2828-4872-449d-a272-8ce96039943e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.747564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '016f8a20-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'd4388f7153056c4db539045ce2849e6c6c3c03c42c295ce0c7952df7ea581f83'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.747564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '016f9402-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '1da498237aae8cf30db977455d739c5a47560293c77af14ce9319cd2c8ba976b'}]}, 'timestamp': '2025-11-22 08:12:36.748098', '_unique_id': '1dfd4ba79e6f4011a92277540fe78de2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.749 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.bytes volume: 31087104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.749 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cceaa535-bbeb-4131-ba80-f37e2f790793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31087104, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.749608', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '016fd9bc-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '08c4c9e083682865ccfae3d41cdab6ca753b2c6accd4c3d69e8d2bd44e68d24f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.749608', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '016fe326-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '984b92a8050b8b9de29ea372d34af249cb6bc5edf530fa8309b1a29b956c0c58'}]}, 'timestamp': '2025-11-22 08:12:36.750104', '_unique_id': 'e29b7ac911c34d969ea2cd36cdf56ad0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.751 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.752 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>]
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.752 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.752 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa2e7455-ea16-4d37-a3cd-b59fd27bb5c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.752388', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '01704820-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '1283cdd92353937635cc654ad872f66bca62ef3fe3a8f0dc4acb6d8c2aaa158f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.752388', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '01705450-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'fca9e70e047b5e2f39e8b837465622e5e90ae17118d37a74bdef3d421711c4e6'}]}, 'timestamp': '2025-11-22 08:12:36.753045', '_unique_id': 'b09cd81c65a1456bb542c32b4ab508d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.754 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.755 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.755 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>]
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.755 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.755 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7eecd52-906b-49c8-9201-06e98d7c0af9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.755501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '0170c03e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '67ee4d0cd8eb45a5c85f82af8562e5bffec27e0d50f7171ab49023a262d4965f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.755501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '0170ca5c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'd6b4fcca9bb325568dd871f1b9e4abc08548e2b25f4d0aece4cb938e0d380265'}]}, 'timestamp': '2025-11-22 08:12:36.756034', '_unique_id': 'e49d6bf8809d47b9b1ca47659b886d1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.758 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.requests volume: 1123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.758 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7e59636-aa38-4db8-87f5-922de21a89ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1123, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.757994', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '017121fa-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '1b2fd69d21d13f6175a479015d0e06848c0f7fc3623f4c9d41f7889f111cb37b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.757994', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01712de4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '6fda65ad59962499d5784d3596d516e157e22e1318ba72472e7b4fce37fb0df3'}]}, 'timestamp': '2025-11-22 08:12:36.758574', '_unique_id': '851b7692bd0549bb96a7858ae0180ef4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.760 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.760 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76a0886e-152a-481b-8a1f-aa15359eb688', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.760085', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '01717b6e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'eeba0445f0a8f30a1e48748a3eff682c6a32b9ec3a9d9253f8b426f346e85f5d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.760085', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '01718546-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '2d13ad64b3d76d42ee74e6cef4410925cc88bdfe659b8ba4741c719f21d6f2a8'}]}, 'timestamp': '2025-11-22 08:12:36.760816', '_unique_id': 'ceafbb5ba74343458d41f22c628aaf13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.762 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.762 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1879914593>]
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.763 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.763 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9323755-cb7e-41d2-8360-52e1f36db46a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.763029', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0171e694-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': '4559cb76c6727ac3a1cc3b0a46f08799d1732a50714aea0d5c20a636e1309dfa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.763029', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0171f242-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': 'c640d03b67d4b249825e41cd4cd1c0e9bd1c8f5a98c48f3845571392ec8cb112'}]}, 'timestamp': '2025-11-22 08:12:36.763603', '_unique_id': 'f4c1f3d8100c4b6bb62a108b6672f2e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.765 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.765 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be2eb972-ab1f-4537-8619-777e1d0bbbba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.765383', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '017242c4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': 'be468ba15f465f332d5f313cb4bc6e25868d5df2a44b147b383d312b6797f5b9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.765383', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01724c9c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.377197372, 'message_signature': '01e7142e4a40b0e01c87288424a29712f57919235c2d5be0756755bc81b02749'}]}, 'timestamp': '2025-11-22 08:12:36.765942', '_unique_id': 'f49c5b20bec7425fa620a2f32e50d118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.767 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.bytes volume: 4105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.767 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '939240ed-984d-43c0-8208-1ad2428a7f10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4105, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.767657', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '01729b20-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'd309b41bc5649987c05a3a22551b5f9d7d6bc2ea51727f110abc154bf420def7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.767657', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '0172a4f8-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '80d3c2ab330f5f67dd94391073bdbe608297e637762953b31d33aee21cf66aaf'}]}, 'timestamp': '2025-11-22 08:12:36.768214', '_unique_id': 'c7fd3a77ab4648eda79759a96b03cda6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.769 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/cpu volume: 14130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '020db4a0-00a5-4774-9433-558cc54fa616', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14130000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'timestamp': '2025-11-22T08:12:36.769796', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0172ee86-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.412152674, 'message_signature': 'e0a60f0ab8b882d35bcb1d8315cd0ad29edc547fc8b7442140b8b33617ab35e7'}]}, 'timestamp': '2025-11-22 08:12:36.770076', '_unique_id': '06e101b6a0e340cdac58ef5a54acb069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.771 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.771 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f20e3f24-12b4-41be-9636-bc5e9fc189db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-vda', 'timestamp': '2025-11-22T08:12:36.771704', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01733936-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': '6313564c19d559edc9e029bbccb24eea0d5c3cf2e6f577987af52e285f36931d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c-sda', 'timestamp': '2025-11-22T08:12:36.771704', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'instance-00000083', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01734354-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.342098705, 'message_signature': 'ebceed72ab4720ab9dabc8263cd4893465e75fa1ac9b0c27d7a3f3460641f8c3'}]}, 'timestamp': '2025-11-22 08:12:36.772274', '_unique_id': '7201728724d04676ae6e316b72d81052'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.774 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.774 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '350cbba1-8513-4e9b-85e9-8ef46f8dd6a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.774003', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '017395a2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'c869dbd9052527efe415eb9e258f7cd696be98de0c28ed030c27dcc8cfc40b64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.774003', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '0173a2c2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '93a52b747af7bdae0a704f65317d29bcbc69088008a639887cae27e090b038df'}]}, 'timestamp': '2025-11-22 08:12:36.774681', '_unique_id': 'ed98ad5dd2124d44b60b47690cb320c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.776 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.776 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b1c9f46-fd45-43ad-91f1-b46b0ce5be81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.776366', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '0173ef5c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '35979ed2ebb2d327bb4d9f500273f4f7eb2c8c910ec4bd7c0ab4e1216b0c2fed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.776366', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '0173f92a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '080aa76d07c833c094ef59a0695d354e6acc50b8c5b9043a53317b4cd5048a16'}]}, 'timestamp': '2025-11-22 08:12:36.776891', '_unique_id': '880e32f1ecf0437bb15766b291d189fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.778 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.779 12 DEBUG ceilometer.compute.pollsters [-] c48669e2-d72e-4b32-9bfa-ebda39e1376c/network.outgoing.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c64cfbef-29dd-40fc-8a8c-b264d7bb681c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap136f6afa-dc', 'timestamp': '2025-11-22T08:12:36.778702', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap136f6afa-dc', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:68:4f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap136f6afa-dc'}, 'message_id': '01744c40-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': '57df7b21c61e87051e40dd8a965bb683cd914a9b4e07a7dd8945f93ed71fcd86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000083-c48669e2-d72e-4b32-9bfa-ebda39e1376c-tap3869bf72-6a', 'timestamp': '2025-11-22T08:12:36.778702', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1879914593', 'name': 'tap3869bf72-6a', 'instance_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:9c:ac', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3869bf72-6a'}, 'message_id': '017457e4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 5899.371327867, 'message_signature': 'db902fb05bcd698b0555ed35d9c649e71093ab4bff1aeb62e20c98735c9ce57e'}]}, 'timestamp': '2025-11-22 08:12:36.779381', '_unique_id': '2e691928a8484872a8767d9ee8683800'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:12:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:12:36 np0005531887 nova_compute[186849]: 2025-11-22 08:12:36.781 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:36 np0005531887 nova_compute[186849]: 2025-11-22 08:12:36.993 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:37.346 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:37.347 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:12:37.347 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:12:40 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:40Z|00431|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:40 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:40Z|00432|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:40 np0005531887 nova_compute[186849]: 2025-11-22 08:12:40.088 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:41 np0005531887 nova_compute[186849]: 2025-11-22 08:12:41.784 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:41 np0005531887 podman[233239]: 2025-11-22 08:12:41.832437233 +0000 UTC m=+0.048137990 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:12:41 np0005531887 nova_compute[186849]: 2025-11-22 08:12:41.995 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:42 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:42Z|00433|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:42 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:42Z|00434|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:42 np0005531887 nova_compute[186849]: 2025-11-22 08:12:42.460 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:43 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:43Z|00435|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:12:43 np0005531887 ovn_controller[95130]: 2025-11-22T08:12:43Z|00436|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:12:43 np0005531887 nova_compute[186849]: 2025-11-22 08:12:43.471 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:46 np0005531887 nova_compute[186849]: 2025-11-22 08:12:46.787 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:46 np0005531887 nova_compute[186849]: 2025-11-22 08:12:46.997 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:12:48 np0005531887 podman[233266]: 2025-11-22 08:12:48.838967154 +0000 UTC m=+0.059845418 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:12:50 np0005531887 nova_compute[186849]: 2025-11-22 08:12:50.468 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:51 np0005531887 nova_compute[186849]: 2025-11-22 08:12:51.790 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:52 np0005531887 nova_compute[186849]: 2025-11-22 08:12:51.999 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:52 np0005531887 nova_compute[186849]: 2025-11-22 08:12:52.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:52 np0005531887 podman[233285]: 2025-11-22 08:12:52.859817403 +0000 UTC m=+0.079726178 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:12:52 np0005531887 nova_compute[186849]: 2025-11-22 08:12:52.862 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:55 np0005531887 nova_compute[186849]: 2025-11-22 08:12:55.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:56 np0005531887 nova_compute[186849]: 2025-11-22 08:12:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:12:56 np0005531887 nova_compute[186849]: 2025-11-22 08:12:56.792 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:57 np0005531887 nova_compute[186849]: 2025-11-22 08:12:57.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:12:57 np0005531887 podman[233319]: 2025-11-22 08:12:57.846093098 +0000 UTC m=+0.055087390 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.765 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.791 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.794 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.848 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.905 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.906 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:01 np0005531887 nova_compute[186849]: 2025-11-22 08:13:01.960 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.004 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.141 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.143 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5577MB free_disk=73.30214309692383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.143 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.143 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.226 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c48669e2-d72e-4b32-9bfa-ebda39e1376c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.227 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.227 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.289 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.300 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.302 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:13:02 np0005531887 nova_compute[186849]: 2025-11-22 08:13:02.302 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:03 np0005531887 nova_compute[186849]: 2025-11-22 08:13:03.303 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:03 np0005531887 nova_compute[186849]: 2025-11-22 08:13:03.304 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:13:03 np0005531887 nova_compute[186849]: 2025-11-22 08:13:03.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:03 np0005531887 nova_compute[186849]: 2025-11-22 08:13:03.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:13:04 np0005531887 podman[233350]: 2025-11-22 08:13:04.836573454 +0000 UTC m=+0.058854883 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.992 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.993 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:13:04 np0005531887 nova_compute[186849]: 2025-11-22 08:13:04.993 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c48669e2-d72e-4b32-9bfa-ebda39e1376c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:05 np0005531887 nova_compute[186849]: 2025-11-22 08:13:05.731 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:06 np0005531887 nova_compute[186849]: 2025-11-22 08:13:06.795 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:06 np0005531887 podman[233372]: 2025-11-22 08:13:06.831359412 +0000 UTC m=+0.054829134 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Nov 22 03:13:06 np0005531887 podman[233373]: 2025-11-22 08:13:06.859089247 +0000 UTC m=+0.078398336 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:13:07 np0005531887 nova_compute[186849]: 2025-11-22 08:13:07.006 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:07 np0005531887 nova_compute[186849]: 2025-11-22 08:13:07.462 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:07 np0005531887 nova_compute[186849]: 2025-11-22 08:13:07.475 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:07 np0005531887 nova_compute[186849]: 2025-11-22 08:13:07.475 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:13:08 np0005531887 nova_compute[186849]: 2025-11-22 08:13:08.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:11 np0005531887 nova_compute[186849]: 2025-11-22 08:13:11.798 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531887 nova_compute[186849]: 2025-11-22 08:13:12.008 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:12 np0005531887 nova_compute[186849]: 2025-11-22 08:13:12.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:12 np0005531887 podman[233415]: 2025-11-22 08:13:12.826354731 +0000 UTC m=+0.050537598 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:13:13 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:13Z|00437|binding|INFO|Releasing lport 725c746c-ac46-482e-8d13-14e88613ed55 from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:13Z|00438|binding|INFO|Releasing lport f86e6fc7-3969-4922-9612-9c86d85f21ec from this chassis (sb_readonly=0)
Nov 22 03:13:13 np0005531887 nova_compute[186849]: 2025-11-22 08:13:13.085 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.802 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.867 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.868 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.868 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.868 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.869 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.876 186853 INFO nova.compute.manager [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Terminating instance#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.886 186853 DEBUG nova.compute.manager [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:13:16 np0005531887 kernel: tap136f6afa-dc (unregistering): left promiscuous mode
Nov 22 03:13:16 np0005531887 NetworkManager[55210]: <info>  [1763799196.9189] device (tap136f6afa-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00439|binding|INFO|Releasing lport 136f6afa-dc75-4024-af1d-a03b4dae22a5 from this chassis (sb_readonly=0)
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00440|binding|INFO|Setting lport 136f6afa-dc75-4024-af1d-a03b4dae22a5 down in Southbound
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00441|binding|INFO|Removing iface tap136f6afa-dc ovn-installed in OVS
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.928 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.937 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:68:4f 10.100.0.14'], port_security=['fa:16:3e:ca:68:4f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04f3fbae-1178-425a-a955-30dcd392a3d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79e2fe83-1ab0-49c1-acb4-3bc86f0137dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=136f6afa-dc75-4024-af1d-a03b4dae22a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.938 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 136f6afa-dc75-4024-af1d-a03b4dae22a5 in datapath 04f3fbae-1178-425a-a955-30dcd392a3d3 unbound from our chassis#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.939 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.939 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04f3fbae-1178-425a-a955-30dcd392a3d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:16 np0005531887 kernel: tap3869bf72-6a (unregistering): left promiscuous mode
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.941 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f52be7-2d22-4397-a0bd-ba892966fb62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.942 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 namespace which is not needed anymore#033[00m
Nov 22 03:13:16 np0005531887 NetworkManager[55210]: <info>  [1763799196.9436] device (tap3869bf72-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.946 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.951 186853 DEBUG nova.compute.manager [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.951 186853 DEBUG nova.compute.manager [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing instance network info cache due to event network-changed-136f6afa-dc75-4024-af1d-a03b4dae22a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.952 186853 DEBUG oslo_concurrency.lockutils [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.952 186853 DEBUG oslo_concurrency.lockutils [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.952 186853 DEBUG nova.network.neutron [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Refreshing network info cache for port 136f6afa-dc75-4024-af1d-a03b4dae22a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00442|binding|INFO|Releasing lport 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 from this chassis (sb_readonly=0)
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00443|binding|INFO|Setting lport 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 down in Southbound
Nov 22 03:13:16 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:16Z|00444|binding|INFO|Removing iface tap3869bf72-6a ovn-installed in OVS
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 nova_compute[186849]: 2025-11-22 08:13:16.970 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:16.975 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:9c:ac 2001:db8:0:1:f816:3eff:fe80:9cac 2001:db8::f816:3eff:fe80:9cac'], port_security=['fa:16:3e:80:9c:ac 2001:db8:0:1:f816:3eff:fe80:9cac 2001:db8::f816:3eff:fe80:9cac'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe80:9cac/64 2001:db8::f816:3eff:fe80:9cac/64', 'neutron:device_id': 'c48669e2-d72e-4b32-9bfa-ebda39e1376c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd57662f9-c343-413b-940d-39a2648160cf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:17 np0005531887 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000083.scope: Deactivated successfully.
Nov 22 03:13:17 np0005531887 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000083.scope: Consumed 20.025s CPU time.
Nov 22 03:13:17 np0005531887 systemd-machined[153180]: Machine qemu-48-instance-00000083 terminated.
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 NetworkManager[55210]: <info>  [1763799197.1211] manager: (tap3869bf72-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.160 186853 INFO nova.virt.libvirt.driver [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Instance destroyed successfully.#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.160 186853 DEBUG nova.objects.instance [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid c48669e2-d72e-4b32-9bfa-ebda39e1376c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.176 186853 DEBUG nova.virt.libvirt.vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:37Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.177 186853 DEBUG nova.network.os_vif_util [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.177 186853 DEBUG nova.network.os_vif_util [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.178 186853 DEBUG os_vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.180 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap136f6afa-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.181 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.183 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.186 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.189 186853 INFO os_vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=136f6afa-dc75-4024-af1d-a03b4dae22a5,network=Network(04f3fbae-1178-425a-a955-30dcd392a3d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap136f6afa-dc')#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.190 186853 DEBUG nova.virt.libvirt.vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1879914593',display_name='tempest-TestGettingAddress-server-1879914593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1879914593',id=131,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+XE0F3HG2DEqNuf9uspjb7s2sZ2F3wHmlqMBy0O1++z8JVdcWahpbs34YYp0VwN7s8d9LGki42J5P4WPCne3zShzkztpCjZs4MsI2yFB6qUJwoGFmoflVAMWMqw1LTNA==',key_name='tempest-TestGettingAddress-1855642537',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-jeejqnuo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:37Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c48669e2-d72e-4b32-9bfa-ebda39e1376c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.190 186853 DEBUG nova.network.os_vif_util [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.190 186853 DEBUG nova.network.os_vif_util [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.191 186853 DEBUG os_vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.192 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.192 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3869bf72-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.193 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.195 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.196 186853 INFO os_vif [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:9c:ac,bridge_name='br-int',has_traffic_filtering=True,id=3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1,network=Network(2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3869bf72-6a')#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.197 186853 INFO nova.virt.libvirt.driver [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Deleting instance files /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c_del#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.197 186853 INFO nova.virt.libvirt.driver [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Deletion of /var/lib/nova/instances/c48669e2-d72e-4b32-9bfa-ebda39e1376c_del complete#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.277 186853 INFO nova.compute.manager [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.277 186853 DEBUG oslo.service.loopingcall [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.277 186853 DEBUG nova.compute.manager [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:13:17 np0005531887 nova_compute[186849]: 2025-11-22 08:13:17.277 186853 DEBUG nova.network.neutron [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:13:17 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [NOTICE]   (232808) : haproxy version is 2.8.14-c23fe91
Nov 22 03:13:17 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [NOTICE]   (232808) : path to executable is /usr/sbin/haproxy
Nov 22 03:13:17 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [WARNING]  (232808) : Exiting Master process...
Nov 22 03:13:17 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [ALERT]    (232808) : Current worker (232810) exited with code 143 (Terminated)
Nov 22 03:13:17 np0005531887 neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3[232804]: [WARNING]  (232808) : All workers exited. Exiting... (0)
Nov 22 03:13:17 np0005531887 systemd[1]: libpod-1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b.scope: Deactivated successfully.
Nov 22 03:13:17 np0005531887 conmon[232804]: conmon 1b7d278f496d0a8e7c2c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b.scope/container/memory.events
Nov 22 03:13:17 np0005531887 podman[233465]: 2025-11-22 08:13:17.300832627 +0000 UTC m=+0.259853375 container died 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:13:17 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b-userdata-shm.mount: Deactivated successfully.
Nov 22 03:13:17 np0005531887 systemd[1]: var-lib-containers-storage-overlay-145a19438fc4c00ac4ac3524022b7a99f4c1f89656fef9512c781abbbd6a9704-merged.mount: Deactivated successfully.
Nov 22 03:13:17 np0005531887 podman[233465]: 2025-11-22 08:13:17.918125761 +0000 UTC m=+0.877146489 container cleanup 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 03:13:17 np0005531887 systemd[1]: libpod-conmon-1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b.scope: Deactivated successfully.
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.271 186853 DEBUG nova.compute.manager [req-6133b2e4-b86d-4e62-8403-d12b18aa7e59 req-1d28a8cf-d496-436c-92bd-89c966d47606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-deleted-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.271 186853 INFO nova.compute.manager [req-6133b2e4-b86d-4e62-8403-d12b18aa7e59 req-1d28a8cf-d496-436c-92bd-89c966d47606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Neutron deleted interface 136f6afa-dc75-4024-af1d-a03b4dae22a5; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.271 186853 DEBUG nova.network.neutron [req-6133b2e4-b86d-4e62-8403-d12b18aa7e59 req-1d28a8cf-d496-436c-92bd-89c966d47606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.295 186853 DEBUG nova.compute.manager [req-6133b2e4-b86d-4e62-8403-d12b18aa7e59 req-1d28a8cf-d496-436c-92bd-89c966d47606 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Detach interface failed, port_id=136f6afa-dc75-4024-af1d-a03b4dae22a5, reason: Instance c48669e2-d72e-4b32-9bfa-ebda39e1376c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:13:18 np0005531887 podman[233523]: 2025-11-22 08:13:18.456149528 +0000 UTC m=+0.512465278 container remove 1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.461 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[01841f80-0af5-47e5-ad46-944e34449ccc]: (4, ('Sat Nov 22 08:13:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 (1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b)\n1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b\nSat Nov 22 08:13:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 (1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b)\n1b7d278f496d0a8e7c2c95c60d5a9671b86815b5edcadb8fdc978f06991c8e4b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.463 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[573a9f87-faf0-4452-b1ef-867997a72fa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.464 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04f3fbae-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.465 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:18 np0005531887 kernel: tap04f3fbae-10: left promiscuous mode
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.478 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.481 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7afec27d-ec11-46dc-8cfb-911161ca4920]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.498 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4dfdd90c-3566-4fd4-97bd-3ccf5ee46ddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.501 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7476a2fc-3af5-4f97-95ec-b3955512ab2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.522 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[09d895ff-e9bb-4f60-96d3-2eb1bb4701c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583693, 'reachable_time': 35925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233539, 'error': None, 'target': 'ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 systemd[1]: run-netns-ovnmeta\x2d04f3fbae\x2d1178\x2d425a\x2da955\x2d30dcd392a3d3.mount: Deactivated successfully.
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.527 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04f3fbae-1178-425a-a955-30dcd392a3d3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.527 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[0e49e3a9-68c7-48d0-b605-514ecdc08c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.529 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 unbound from our chassis#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.530 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.530 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbe64ea-496d-4ad7-8293-eed2adb8bc6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:18.531 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 namespace which is not needed anymore#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.656 186853 DEBUG nova.network.neutron [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.667 186853 INFO nova.compute.manager [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Took 1.39 seconds to deallocate network for instance.#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.726 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.727 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.793 186853 DEBUG nova.compute.provider_tree [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.820 186853 DEBUG nova.scheduler.client.report [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.841 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.867 186853 INFO nova.scheduler.client.report [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance c48669e2-d72e-4b32-9bfa-ebda39e1376c#033[00m
Nov 22 03:13:18 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [NOTICE]   (232889) : haproxy version is 2.8.14-c23fe91
Nov 22 03:13:18 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [NOTICE]   (232889) : path to executable is /usr/sbin/haproxy
Nov 22 03:13:18 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [WARNING]  (232889) : Exiting Master process...
Nov 22 03:13:18 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [ALERT]    (232889) : Current worker (232891) exited with code 143 (Terminated)
Nov 22 03:13:18 np0005531887 neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94[232885]: [WARNING]  (232889) : All workers exited. Exiting... (0)
Nov 22 03:13:18 np0005531887 systemd[1]: libpod-d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5.scope: Deactivated successfully.
Nov 22 03:13:18 np0005531887 podman[233557]: 2025-11-22 08:13:18.89750068 +0000 UTC m=+0.272526676 container died d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:13:18 np0005531887 nova_compute[186849]: 2025-11-22 08:13:18.938 186853 DEBUG oslo_concurrency.lockutils [None req-f4fffd6b-58d8-424b-b976-ba59673e3328 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.111 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-unplugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.112 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.112 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.112 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.113 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No waiting events found dispatching network-vif-unplugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.113 186853 WARNING nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-unplugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.113 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.113 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No waiting events found dispatching network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 WARNING nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-plugged-136f6afa-dc75-4024-af1d-a03b4dae22a5 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-unplugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.114 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.115 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.116 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.116 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No waiting events found dispatching network-vif-unplugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.116 186853 WARNING nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-unplugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.116 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.117 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.117 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.117 186853 DEBUG oslo_concurrency.lockutils [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c48669e2-d72e-4b32-9bfa-ebda39e1376c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.117 186853 DEBUG nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] No waiting events found dispatching network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.117 186853 WARNING nova.compute.manager [req-b1c9fba8-1133-4472-a665-a8c3d790eca8 req-d293be4b-0530-43c1-94f0-60db1d617fdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received unexpected event network-vif-plugged-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:13:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5-userdata-shm.mount: Deactivated successfully.
Nov 22 03:13:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay-a27d9b528248bd6566a9c65b35d4d9c8384942bbdaf0aae60b44541565327c00-merged.mount: Deactivated successfully.
Nov 22 03:13:19 np0005531887 podman[233569]: 2025-11-22 08:13:19.30757401 +0000 UTC m=+0.388828767 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:13:19 np0005531887 podman[233557]: 2025-11-22 08:13:19.59367306 +0000 UTC m=+0.968699026 container cleanup d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:13:19 np0005531887 systemd[1]: libpod-conmon-d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5.scope: Deactivated successfully.
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.614 186853 DEBUG nova.network.neutron [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updated VIF entry in instance network info cache for port 136f6afa-dc75-4024-af1d-a03b4dae22a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.614 186853 DEBUG nova.network.neutron [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Updating instance_info_cache with network_info: [{"id": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "address": "fa:16:3e:ca:68:4f", "network": {"id": "04f3fbae-1178-425a-a955-30dcd392a3d3", "bridge": "br-int", "label": "tempest-network-smoke--515686970", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap136f6afa-dc", "ovs_interfaceid": "136f6afa-dc75-4024-af1d-a03b4dae22a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "address": "fa:16:3e:80:9c:ac", "network": {"id": "2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94", "bridge": "br-int", "label": "tempest-network-smoke--1300577145", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe80:9cac", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3869bf72-6a", "ovs_interfaceid": "3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:19 np0005531887 nova_compute[186849]: 2025-11-22 08:13:19.645 186853 DEBUG oslo_concurrency.lockutils [req-88ea6ae2-e85a-46fc-836e-860bd00afff8 req-ea951944-c800-42ea-959d-c2060c6e68a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c48669e2-d72e-4b32-9bfa-ebda39e1376c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:20 np0005531887 podman[233611]: 2025-11-22 08:13:20.239596111 +0000 UTC m=+0.610686701 container remove d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.247 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f2189795-72e0-44ca-99c8-ba6ecb24f2a3]: (4, ('Sat Nov 22 08:13:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 (d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5)\nd4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5\nSat Nov 22 08:13:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 (d4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5)\nd4a4ccb1dfd5222d0aac7cb80698359b95f86a6c4fc33bfd749d09bf76eb2cc5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.250 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4810d4-d3d0-4d4a-99bd-9730d844407d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.251 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b7e9f2d-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:20 np0005531887 nova_compute[186849]: 2025-11-22 08:13:20.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:20 np0005531887 kernel: tap2b7e9f2d-20: left promiscuous mode
Nov 22 03:13:20 np0005531887 nova_compute[186849]: 2025-11-22 08:13:20.266 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.271 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[42a86b1b-bbda-4170-a37b-c16febadc846]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.291 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8203337c-1094-4bc9-8c8c-619a6a783793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.294 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5691dad3-e46f-4ac1-8424-07b99748277c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.314 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b69242-08c1-49a7-995c-88e801dfcb1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583784, 'reachable_time': 31875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233625, 'error': None, 'target': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.317 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:13:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:20.317 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7af4f9-c8d4-408f-8ce1-392e0bef8e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:20 np0005531887 systemd[1]: run-netns-ovnmeta\x2d2b7e9f2d\x2d2098\x2d4c6b\x2d8c54\x2da96b7bb53c94.mount: Deactivated successfully.
Nov 22 03:13:20 np0005531887 nova_compute[186849]: 2025-11-22 08:13:20.363 186853 DEBUG nova.compute.manager [req-f1d535a4-4441-42e5-b030-a49a955003b2 req-490f5656-c9d8-4ed5-9d8f-324a0a7b5871 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Received event network-vif-deleted-3869bf72-6a4e-49fa-8cb3-d024d3f8b1a1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:21.572 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:21 np0005531887 nova_compute[186849]: 2025-11-22 08:13:21.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:21.573 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:13:22 np0005531887 nova_compute[186849]: 2025-11-22 08:13:22.010 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:22 np0005531887 nova_compute[186849]: 2025-11-22 08:13:22.194 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:23 np0005531887 podman[233630]: 2025-11-22 08:13:23.836112277 +0000 UTC m=+0.057679814 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.606 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.770 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.771 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.772 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.772 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.772 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.773 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.795 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.796 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.796 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.796 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.797 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.797 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Removable base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29 /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.798 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.799 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.799 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.800 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.800 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.800 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 22 03:13:26 np0005531887 nova_compute[186849]: 2025-11-22 08:13:26.800 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 22 03:13:27 np0005531887 nova_compute[186849]: 2025-11-22 08:13:27.012 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:27 np0005531887 nova_compute[186849]: 2025-11-22 08:13:27.196 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:28 np0005531887 podman[233651]: 2025-11-22 08:13:28.8442054 +0000 UTC m=+0.061808077 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:13:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:30.574 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:13:32 np0005531887 nova_compute[186849]: 2025-11-22 08:13:32.014 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:32 np0005531887 nova_compute[186849]: 2025-11-22 08:13:32.159 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799197.1579795, c48669e2-d72e-4b32-9bfa-ebda39e1376c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:13:32 np0005531887 nova_compute[186849]: 2025-11-22 08:13:32.160 186853 INFO nova.compute.manager [-] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] VM Stopped (Lifecycle Event)
Nov 22 03:13:32 np0005531887 nova_compute[186849]: 2025-11-22 08:13:32.178 186853 DEBUG nova.compute.manager [None req-9f4cf47c-d722-4f04-bd4a-610f090c0995 - - - - - -] [instance: c48669e2-d72e-4b32-9bfa-ebda39e1376c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:13:32 np0005531887 nova_compute[186849]: 2025-11-22 08:13:32.198 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:35 np0005531887 podman[233678]: 2025-11-22 08:13:35.836070259 +0000 UTC m=+0.055875010 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 22 03:13:37 np0005531887 nova_compute[186849]: 2025-11-22 08:13:37.017 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:37 np0005531887 nova_compute[186849]: 2025-11-22 08:13:37.201 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:37.347 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:37.347 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:37.348 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:13:37 np0005531887 podman[233699]: 2025-11-22 08:13:37.862802365 +0000 UTC m=+0.074104878 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:13:37 np0005531887 podman[233700]: 2025-11-22 08:13:37.892222942 +0000 UTC m=+0.103143766 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:13:42 np0005531887 nova_compute[186849]: 2025-11-22 08:13:42.019 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:42 np0005531887 nova_compute[186849]: 2025-11-22 08:13:42.203 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:43 np0005531887 podman[233741]: 2025-11-22 08:13:43.832367817 +0000 UTC m=+0.055671275 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:13:47 np0005531887 nova_compute[186849]: 2025-11-22 08:13:47.021 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:47 np0005531887 nova_compute[186849]: 2025-11-22 08:13:47.204 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:49 np0005531887 podman[233767]: 2025-11-22 08:13:49.840242903 +0000 UTC m=+0.051408969 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:13:52 np0005531887 nova_compute[186849]: 2025-11-22 08:13:52.023 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:52 np0005531887 nova_compute[186849]: 2025-11-22 08:13:52.206 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.034 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.035 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.051 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.154 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.154 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.162 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.163 186853 INFO nova.compute.claims [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Claim successful on node compute-1.ctlplane.example.com
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.429 186853 DEBUG nova.compute.provider_tree [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.442 186853 DEBUG nova.scheduler.client.report [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.462 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.463 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.509 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.510 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.530 186853 INFO nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.554 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.642 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.643 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.643 186853 INFO nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Creating image(s)
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.644 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.644 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.645 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.657 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.694 186853 DEBUG nova.policy [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.727 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.728 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.728 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.741 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.804 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.805 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.918 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk 1073741824" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.920 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:13:53 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.921 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:53.999 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.001 186853 DEBUG nova.virt.disk.api [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.002 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.069 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.072 186853 DEBUG nova.virt.disk.api [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.073 186853 DEBUG nova.objects.instance [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid cd188127-6d18-4ee7-a764-30f3612157e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.097 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.098 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Ensure instance console log exists: /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.098 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.099 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.099 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:54 np0005531887 nova_compute[186849]: 2025-11-22 08:13:54.246 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Successfully created port: 4b028913-9948-488e-a6e0-30df6e049217 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:13:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:54.298 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8:0:1:f816:3eff:fe7e:a2f4 2001:db8::f816:3eff:fe7e:a2f4'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7e:a2f4/64 2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=433cf940-3b59-425c-aeb8-689a57de46c2) old=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8::f816:3eff:fe7e:a2f4'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:54.299 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 433cf940-3b59-425c-aeb8-689a57de46c2 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 updated#033[00m
Nov 22 03:13:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:54.301 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:13:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:54.302 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7a9b44-00c8-4d03-a750-4eba25bb6d25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:54 np0005531887 podman[233801]: 2025-11-22 08:13:54.840194995 +0000 UTC m=+0.058198087 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.269 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Successfully updated port: 4b028913-9948-488e-a6e0-30df6e049217 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.290 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.290 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.290 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.382 186853 DEBUG nova.compute.manager [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-changed-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.383 186853 DEBUG nova.compute.manager [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Refreshing instance network info cache due to event network-changed-4b028913-9948-488e-a6e0-30df6e049217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.383 186853 DEBUG oslo_concurrency.lockutils [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:13:55 np0005531887 nova_compute[186849]: 2025-11-22 08:13:55.438 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.186 186853 DEBUG nova.network.neutron [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Updating instance_info_cache with network_info: [{"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.208 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.209 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Instance network_info: |[{"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.209 186853 DEBUG oslo_concurrency.lockutils [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.210 186853 DEBUG nova.network.neutron [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Refreshing network info cache for port 4b028913-9948-488e-a6e0-30df6e049217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.213 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Start _get_guest_xml network_info=[{"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.218 186853 WARNING nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.229 186853 DEBUG nova.virt.libvirt.host [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.231 186853 DEBUG nova.virt.libvirt.host [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.240 186853 DEBUG nova.virt.libvirt.host [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.241 186853 DEBUG nova.virt.libvirt.host [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.243 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.243 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.244 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.244 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.244 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.245 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.245 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.245 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.246 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.246 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.246 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.246 186853 DEBUG nova.virt.hardware [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.252 186853 DEBUG nova.virt.libvirt.vif [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1493876225',display_name='tempest-ServersTestJSON-server-1493876225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1493876225',id=138,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-1q6n9tjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:53Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=cd188127-6d18-4ee7-a764-30f3612157e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.253 186853 DEBUG nova.network.os_vif_util [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.254 186853 DEBUG nova.network.os_vif_util [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.255 186853 DEBUG nova.objects.instance [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid cd188127-6d18-4ee7-a764-30f3612157e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.285 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <uuid>cd188127-6d18-4ee7-a764-30f3612157e8</uuid>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <name>instance-0000008a</name>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersTestJSON-server-1493876225</nova:name>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:13:56</nova:creationTime>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        <nova:port uuid="4b028913-9948-488e-a6e0-30df6e049217">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="serial">cd188127-6d18-4ee7-a764-30f3612157e8</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="uuid">cd188127-6d18-4ee7-a764-30f3612157e8</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.config"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:13:92:17"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <target dev="tap4b028913-99"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/console.log" append="off"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:13:56 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:13:56 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:13:56 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:13:56 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.287 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Preparing to wait for external event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.287 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.287 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.288 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.288 186853 DEBUG nova.virt.libvirt.vif [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1493876225',display_name='tempest-ServersTestJSON-server-1493876225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1493876225',id=138,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-1q6n9tjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-16207700
71-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:53Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=cd188127-6d18-4ee7-a764-30f3612157e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.289 186853 DEBUG nova.network.os_vif_util [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.289 186853 DEBUG nova.network.os_vif_util [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.290 186853 DEBUG os_vif [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.290 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.291 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.291 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.296 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b028913-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.297 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b028913-99, col_values=(('external_ids', {'iface-id': '4b028913-9948-488e-a6e0-30df6e049217', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:92:17', 'vm-uuid': 'cd188127-6d18-4ee7-a764-30f3612157e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.299 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:56 np0005531887 NetworkManager[55210]: <info>  [1763799236.3012] manager: (tap4b028913-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.302 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.309 186853 INFO os_vif [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99')#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.516 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.516 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.517 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:13:92:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.517 186853 INFO nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Using config drive#033[00m
Nov 22 03:13:56 np0005531887 nova_compute[186849]: 2025-11-22 08:13:56.800 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.025 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.088 186853 INFO nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Creating config drive at /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.config#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.093 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkjo9yc_k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.220 186853 DEBUG oslo_concurrency.processutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkjo9yc_k" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:13:57 np0005531887 kernel: tap4b028913-99: entered promiscuous mode
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.3037] manager: (tap4b028913-99): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Nov 22 03:13:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:57Z|00445|binding|INFO|Claiming lport 4b028913-9948-488e-a6e0-30df6e049217 for this chassis.
Nov 22 03:13:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:57Z|00446|binding|INFO|4b028913-9948-488e-a6e0-30df6e049217: Claiming fa:16:3e:13:92:17 10.100.0.6
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.309 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.322 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:92:17 10.100.0.6'], port_security=['fa:16:3e:13:92:17 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd188127-6d18-4ee7-a764-30f3612157e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4b028913-9948-488e-a6e0-30df6e049217) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.324 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4b028913-9948-488e-a6e0-30df6e049217 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.325 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:13:57 np0005531887 systemd-udevd[233838]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.342 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[83a1d3dd-ad88-40d1-bec3-8d28cb6dfd14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.345 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66c945b4-71 in ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.3492] device (tap4b028913-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.3500] device (tap4b028913-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.348 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66c945b4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.348 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c249464d-f509-42f0-8b18-13e6d49ec00c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.353 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd38a48-e0c4-4160-8bf5-3c9bf2a2d9ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 systemd-machined[153180]: New machine qemu-49-instance-0000008a.
Nov 22 03:13:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:57Z|00447|binding|INFO|Setting lport 4b028913-9948-488e-a6e0-30df6e049217 ovn-installed in OVS
Nov 22 03:13:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:57Z|00448|binding|INFO|Setting lport 4b028913-9948-488e-a6e0-30df6e049217 up in Southbound
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.370 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a31d09f1-31ec-4828-9ac5-f3c4f0080e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.373 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 systemd[1]: Started Virtual Machine qemu-49-instance-0000008a.
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.392 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5c763622-93eb-4b2b-a2da-ee80939f8a94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.422 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d63e80-4a12-442d-9b44-dd9fbd509187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.429 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[62010baa-bb14-45f3-8789-cffd36469254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.4299] manager: (tap66c945b4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/202)
Nov 22 03:13:57 np0005531887 systemd-udevd[233844]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.474 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9a601b-7e99-41e9-98ec-8b83b43aeb57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.477 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ba18d4bc-5e4d-4938-b727-8597b9ee9df6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.5068] device (tap66c945b4-70): carrier: link connected
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.514 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[e51b22e1-ebd4-4d02-8615-a35c62334bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.534 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc56937-d30b-45ef-b401-8287970e65a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598011, 'reachable_time': 42314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233874, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.556 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c52d60-01d3-4e35-a9dc-94db50d06391]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:5a27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598011, 'tstamp': 598011}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233875, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.578 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f12bd1fc-b701-4833-a35e-3ae189828c6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598011, 'reachable_time': 42314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233876, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.605 186853 DEBUG nova.compute.manager [req-6a48c867-4c75-41db-ba1f-0add86a78c52 req-14e7399c-546a-4ae8-b850-1d9ba1b781cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.606 186853 DEBUG oslo_concurrency.lockutils [req-6a48c867-4c75-41db-ba1f-0add86a78c52 req-14e7399c-546a-4ae8-b850-1d9ba1b781cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.606 186853 DEBUG oslo_concurrency.lockutils [req-6a48c867-4c75-41db-ba1f-0add86a78c52 req-14e7399c-546a-4ae8-b850-1d9ba1b781cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.607 186853 DEBUG oslo_concurrency.lockutils [req-6a48c867-4c75-41db-ba1f-0add86a78c52 req-14e7399c-546a-4ae8-b850-1d9ba1b781cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.607 186853 DEBUG nova.compute.manager [req-6a48c867-4c75-41db-ba1f-0add86a78c52 req-14e7399c-546a-4ae8-b850-1d9ba1b781cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Processing event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.615 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[90e7fafe-6bbf-4839-8301-e5cc35fac868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.693 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[078a4947-3878-4326-a671-7f2cdd218d57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.695 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.695 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.696 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:57 np0005531887 kernel: tap66c945b4-70: entered promiscuous mode
Nov 22 03:13:57 np0005531887 NetworkManager[55210]: <info>  [1763799237.6997] manager: (tap66c945b4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.698 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.702 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:13:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:13:57Z|00449|binding|INFO|Releasing lport d6ef1392-aa2a-4e3e-91ba-ec0ce61e416a from this chassis (sb_readonly=0)
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.703 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 nova_compute[186849]: 2025-11-22 08:13:57.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.717 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.719 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[53266b7f-0e31-4373-aecb-666b4a81dcb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.720 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:13:57 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:57.721 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'env', 'PROCESS_TAG=haproxy-66c945b4-7237-4e85-b411-0c51b31ea31a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66c945b4-7237-4e85-b411-0c51b31ea31a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:13:58 np0005531887 podman[233906]: 2025-11-22 08:13:58.13963614 +0000 UTC m=+0.038141292 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:13:58 np0005531887 podman[233906]: 2025-11-22 08:13:58.381733225 +0000 UTC m=+0.280238357 container create 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 03:13:58 np0005531887 systemd[1]: Started libpod-conmon-0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077.scope.
Nov 22 03:13:58 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:13:58 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd9c02880acd9aa46147a09cab71d9bdc9d1dc95a59320f134c36316ba441192/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:13:58 np0005531887 podman[233906]: 2025-11-22 08:13:58.519817193 +0000 UTC m=+0.418322365 container init 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:13:58 np0005531887 podman[233906]: 2025-11-22 08:13:58.529058921 +0000 UTC m=+0.427564043 container start 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 03:13:58 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [NOTICE]   (233931) : New worker (233933) forked
Nov 22 03:13:58 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [NOTICE]   (233931) : Loading success.
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.555 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.556 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799238.5551383, cd188127-6d18-4ee7-a764-30f3612157e8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.557 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] VM Started (Lifecycle Event)#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.559 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.563 186853 INFO nova.virt.libvirt.driver [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Instance spawned successfully.#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.563 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.599 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.608 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.618 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.619 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.620 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.620 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.621 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.621 186853 DEBUG nova.virt.libvirt.driver [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.649 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.650 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799238.5553622, cd188127-6d18-4ee7-a764-30f3612157e8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.650 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.692 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:13:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:58.693 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.695 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:13:58.695 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.699 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799238.5589678, cd188127-6d18-4ee7-a764-30f3612157e8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.699 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.717 186853 DEBUG nova.network.neutron [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Updated VIF entry in instance network info cache for port 4b028913-9948-488e-a6e0-30df6e049217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.718 186853 DEBUG nova.network.neutron [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Updating instance_info_cache with network_info: [{"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.721 186853 INFO nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Took 5.08 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.721 186853 DEBUG nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.729 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.732 186853 DEBUG oslo_concurrency.lockutils [req-6ae0fc15-b94c-4eff-b846-b0ee3398edb8 req-ced66dc1-1068-4209-bd4f-ae34cd01974c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-cd188127-6d18-4ee7-a764-30f3612157e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.735 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.752 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.800 186853 INFO nova.compute.manager [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Took 5.69 seconds to build instance.#033[00m
Nov 22 03:13:58 np0005531887 nova_compute[186849]: 2025-11-22 08:13:58.971 186853 DEBUG oslo_concurrency.lockutils [None req-14f76452-f668-4c5c-b335-31535d7069ff 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.790 186853 DEBUG nova.compute.manager [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.791 186853 DEBUG oslo_concurrency.lockutils [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.791 186853 DEBUG oslo_concurrency.lockutils [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.791 186853 DEBUG oslo_concurrency.lockutils [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.791 186853 DEBUG nova.compute.manager [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] No waiting events found dispatching network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:13:59 np0005531887 nova_compute[186849]: 2025-11-22 08:13:59.791 186853 WARNING nova.compute.manager [req-a8e992da-9e40-4545-b4f2-2c3d79490efa req-55990912-b827-4898-8b7f-463966fe613a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received unexpected event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:13:59 np0005531887 podman[233942]: 2025-11-22 08:13:59.879316983 +0000 UTC m=+0.088936136 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:14:01 np0005531887 nova_compute[186849]: 2025-11-22 08:14:01.300 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:01.698 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.605 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.605 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.605 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.605 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.605 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.613 186853 INFO nova.compute.manager [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Terminating instance#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.624 186853 DEBUG nova.compute.manager [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:14:02 np0005531887 kernel: tap4b028913-99 (unregistering): left promiscuous mode
Nov 22 03:14:02 np0005531887 NetworkManager[55210]: <info>  [1763799242.6485] device (tap4b028913-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.667 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:02Z|00450|binding|INFO|Releasing lport 4b028913-9948-488e-a6e0-30df6e049217 from this chassis (sb_readonly=0)
Nov 22 03:14:02 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:02Z|00451|binding|INFO|Setting lport 4b028913-9948-488e-a6e0-30df6e049217 down in Southbound
Nov 22 03:14:02 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:02Z|00452|binding|INFO|Removing iface tap4b028913-99 ovn-installed in OVS
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.669 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:02.676 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:92:17 10.100.0.6'], port_security=['fa:16:3e:13:92:17 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd188127-6d18-4ee7-a764-30f3612157e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=4b028913-9948-488e-a6e0-30df6e049217) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:14:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:02.677 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 4b028913-9948-488e-a6e0-30df6e049217 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:14:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:02.679 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66c945b4-7237-4e85-b411-0c51b31ea31a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:14:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:02.680 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6d8956-bd5b-487e-84d9-6ebfa20c0e5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:02.680 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace which is not needed anymore#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.689 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Nov 22 03:14:02 np0005531887 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000008a.scope: Consumed 5.293s CPU time.
Nov 22 03:14:02 np0005531887 systemd-machined[153180]: Machine qemu-49-instance-0000008a terminated.
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [NOTICE]   (233931) : haproxy version is 2.8.14-c23fe91
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [NOTICE]   (233931) : path to executable is /usr/sbin/haproxy
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [WARNING]  (233931) : Exiting Master process...
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [WARNING]  (233931) : Exiting Master process...
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [ALERT]    (233931) : Current worker (233933) exited with code 143 (Terminated)
Nov 22 03:14:02 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[233921]: [WARNING]  (233931) : All workers exited. Exiting... (0)
Nov 22 03:14:02 np0005531887 systemd[1]: libpod-0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077.scope: Deactivated successfully.
Nov 22 03:14:02 np0005531887 podman[233989]: 2025-11-22 08:14:02.83760924 +0000 UTC m=+0.060859814 container died 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.851 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.857 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.884 186853 DEBUG nova.compute.manager [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-unplugged-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.884 186853 DEBUG oslo_concurrency.lockutils [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.885 186853 DEBUG oslo_concurrency.lockutils [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.885 186853 DEBUG oslo_concurrency.lockutils [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.885 186853 DEBUG nova.compute.manager [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] No waiting events found dispatching network-vif-unplugged-4b028913-9948-488e-a6e0-30df6e049217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.885 186853 DEBUG nova.compute.manager [req-87dfe640-1cb9-4f9f-aa5f-14699251edfe req-1ae12865-7323-4f7a-ab72-f69cf68340a8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-unplugged-4b028913-9948-488e-a6e0-30df6e049217 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:14:02 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077-userdata-shm.mount: Deactivated successfully.
Nov 22 03:14:02 np0005531887 systemd[1]: var-lib-containers-storage-overlay-fd9c02880acd9aa46147a09cab71d9bdc9d1dc95a59320f134c36316ba441192-merged.mount: Deactivated successfully.
Nov 22 03:14:02 np0005531887 podman[233989]: 2025-11-22 08:14:02.905448994 +0000 UTC m=+0.128699568 container cleanup 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.905 186853 INFO nova.virt.libvirt.driver [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Instance destroyed successfully.#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.907 186853 DEBUG nova.objects.instance [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid cd188127-6d18-4ee7-a764-30f3612157e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:02 np0005531887 systemd[1]: libpod-conmon-0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077.scope: Deactivated successfully.
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.923 186853 DEBUG nova.virt.libvirt.vif [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:13:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1493876225',display_name='tempest-ServersTestJSON-server-1493876225',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1493876225',id=138,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:13:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-1q6n9tjy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:13:58Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=cd188127-6d18-4ee7-a764-30f3612157e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.923 186853 DEBUG nova.network.os_vif_util [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "4b028913-9948-488e-a6e0-30df6e049217", "address": "fa:16:3e:13:92:17", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b028913-99", "ovs_interfaceid": "4b028913-9948-488e-a6e0-30df6e049217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.924 186853 DEBUG nova.network.os_vif_util [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.924 186853 DEBUG os_vif [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.925 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.926 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b028913-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.927 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.929 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.931 186853 INFO os_vif [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:92:17,bridge_name='br-int',has_traffic_filtering=True,id=4b028913-9948-488e-a6e0-30df6e049217,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b028913-99')#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.932 186853 INFO nova.virt.libvirt.driver [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Deleting instance files /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8_del#033[00m
Nov 22 03:14:02 np0005531887 nova_compute[186849]: 2025-11-22 08:14:02.932 186853 INFO nova.virt.libvirt.driver [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Deletion of /var/lib/nova/instances/cd188127-6d18-4ee7-a764-30f3612157e8_del complete#033[00m
Nov 22 03:14:02 np0005531887 podman[234034]: 2025-11-22 08:14:02.996519911 +0000 UTC m=+0.063820275 container remove 0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.002 186853 INFO nova.compute.manager [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.003 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe1fe8d-961b-4c55-9b3f-a64eba01d62e]: (4, ('Sat Nov 22 08:14:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077)\n0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077\nSat Nov 22 08:14:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077)\n0779f0e4dfe7670ea8962ea8464619d958f0e68e23802b2f260d0afd3e9be077\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.003 186853 DEBUG oslo.service.loopingcall [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.004 186853 DEBUG nova.compute.manager [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.004 186853 DEBUG nova.network.neutron [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.004 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7fa81f-8387-47ed-ae17-6b5466c6ebed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.007 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:03 np0005531887 kernel: tap66c945b4-70: left promiscuous mode
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.023 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.024 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.026 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[61f67ea6-de44-4a0c-855d-89596cd57cc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.048 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc5c21d-3086-4fb0-96bc-756eb7495b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.050 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc0663d-d4a5-40bd-a59d-59160049cc78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.070 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b74620a9-6fd1-43cf-8a0f-cc738e993bfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598002, 'reachable_time': 39912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234049, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 systemd[1]: run-netns-ovnmeta\x2d66c945b4\x2d7237\x2d4e85\x2db411\x2d0c51b31ea31a.mount: Deactivated successfully.
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.075 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:14:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:03.076 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a921f7b0-b46a-4fd0-9f86-7a89c28f244d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:03 np0005531887 nova_compute[186849]: 2025-11-22 08:14:03.791 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.003 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.005 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5700MB free_disk=73.33135223388672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.005 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.005 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.099 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance cd188127-6d18-4ee7-a764-30f3612157e8 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.100 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.100 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.260 186853 DEBUG nova.network.neutron [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.281 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.292 186853 INFO nova.compute.manager [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Took 1.29 seconds to deallocate network for instance.#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.306 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.336 186853 DEBUG nova.compute.manager [req-0b4d7007-8159-4a3f-ae53-253143d12a95 req-c69c9934-a370-4d8e-b24b-6dc5d96841bc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-deleted-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.361 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.362 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.379 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.380 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.442 186853 DEBUG nova.compute.provider_tree [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.456 186853 DEBUG nova.scheduler.client.report [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.480 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.506 186853 INFO nova.scheduler.client.report [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance cd188127-6d18-4ee7-a764-30f3612157e8#033[00m
Nov 22 03:14:04 np0005531887 nova_compute[186849]: 2025-11-22 08:14:04.574 186853 DEBUG oslo_concurrency.lockutils [None req-63d1fca3-2af5-461e-a204-ad140df0a846 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.000 186853 DEBUG nova.compute.manager [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.000 186853 DEBUG oslo_concurrency.lockutils [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.001 186853 DEBUG oslo_concurrency.lockutils [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.001 186853 DEBUG oslo_concurrency.lockutils [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "cd188127-6d18-4ee7-a764-30f3612157e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.001 186853 DEBUG nova.compute.manager [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] No waiting events found dispatching network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.001 186853 WARNING nova.compute.manager [req-8c765d08-a54d-48d2-9481-5adebdfa52bb req-c3cfbcd4-5188-441b-a15b-05f50a7692c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Received unexpected event network-vif-plugged-4b028913-9948-488e-a6e0-30df6e049217 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.364 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.365 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.366 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:14:05 np0005531887 nova_compute[186849]: 2025-11-22 08:14:05.791 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:14:06 np0005531887 podman[234051]: 2025-11-22 08:14:06.840100165 +0000 UTC m=+0.054290590 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:14:07 np0005531887 nova_compute[186849]: 2025-11-22 08:14:07.030 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:07 np0005531887 nova_compute[186849]: 2025-11-22 08:14:07.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.303 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.304 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.331 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.446 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.447 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.455 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.456 186853 INFO nova.compute.claims [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.563 186853 DEBUG nova.compute.provider_tree [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.575 186853 DEBUG nova.scheduler.client.report [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.600 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.602 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.837 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.838 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:14:08 np0005531887 podman[234073]: 2025-11-22 08:14:08.838596046 +0000 UTC m=+0.059120680 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.864 186853 INFO nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:14:08 np0005531887 podman[234074]: 2025-11-22 08:14:08.874559602 +0000 UTC m=+0.090387821 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.881 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.971 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.972 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.972 186853 INFO nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Creating image(s)#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.973 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.973 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.973 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:08 np0005531887 nova_compute[186849]: 2025-11-22 08:14:08.985 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.040 186853 DEBUG nova.policy [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.043 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.044 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.044 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.056 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.117 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.119 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.359 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk 1073741824" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.360 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.361 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.417 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.419 186853 DEBUG nova.virt.disk.api [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.419 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.493 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.494 186853 DEBUG nova.virt.disk.api [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.495 186853 DEBUG nova.objects.instance [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.533 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.533 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Ensure instance console log exists: /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.534 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.534 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.534 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:09 np0005531887 nova_compute[186849]: 2025-11-22 08:14:09.876 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Successfully created port: 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.337 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.338 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.356 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.552 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.553 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.560 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.560 186853 INFO nova.compute.claims [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.807 186853 DEBUG nova.compute.provider_tree [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.829 186853 DEBUG nova.scheduler.client.report [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.853 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.854 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.928 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.928 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.954 186853 INFO nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:14:10 np0005531887 nova_compute[186849]: 2025-11-22 08:14:10.979 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.094 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.096 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.096 186853 INFO nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Creating image(s)#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.097 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.098 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.099 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.099 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.100 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.104 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Successfully updated port: 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.120 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.120 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.121 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.257 186853 DEBUG nova.compute.manager [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-changed-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.258 186853 DEBUG nova.compute.manager [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Refreshing instance network info cache due to event network-changed-0867f0fe-7fa4-488f-8c3f-451ee6c50a10. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.258 186853 DEBUG oslo_concurrency.lockutils [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.322 186853 DEBUG nova.policy [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:14:11 np0005531887 nova_compute[186849]: 2025-11-22 08:14:11.427 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:12 np0005531887 nova_compute[186849]: 2025-11-22 08:14:12.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:12 np0005531887 nova_compute[186849]: 2025-11-22 08:14:12.831 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:12 np0005531887 nova_compute[186849]: 2025-11-22 08:14:12.935 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:13 np0005531887 nova_compute[186849]: 2025-11-22 08:14:13.967 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:14 np0005531887 nova_compute[186849]: 2025-11-22 08:14:14.040 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.part --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:14 np0005531887 nova_compute[186849]: 2025-11-22 08:14:14.041 186853 DEBUG nova.virt.images [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] 2b9b9d31-f80f-437c-8142-755f74bb78ae was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 22 03:14:14 np0005531887 nova_compute[186849]: 2025-11-22 08:14:14.044 186853 DEBUG nova.privsep.utils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 03:14:14 np0005531887 nova_compute[186849]: 2025-11-22 08:14:14.045 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.part /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:14 np0005531887 podman[234141]: 2025-11-22 08:14:14.833361558 +0000 UTC m=+0.051967204 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:14:14 np0005531887 nova_compute[186849]: 2025-11-22 08:14:14.994 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.part /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.converted" returned: 0 in 0.949s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.006 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.064 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73.converted --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.065 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.078 186853 DEBUG nova.network.neutron [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Updating instance_info_cache with network_info: [{"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.081 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.142 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.144 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.144 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.156 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.217 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.218 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73,backing_fmt=raw /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.265 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.266 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Instance network_info: |[{"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.268 186853 DEBUG oslo_concurrency.lockutils [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.268 186853 DEBUG nova.network.neutron [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Refreshing network info cache for port 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.271 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Start _get_guest_xml network_info=[{"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.276 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73,backing_fmt=raw /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.277 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "dd5495c5d01c62e3f2430ded9b741807f5260e73" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.277 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.305 186853 WARNING nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.308 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Successfully created port: 6fca0d10-ec3d-4b7b-844b-458a39db0a47 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.315 186853 DEBUG nova.virt.libvirt.host [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.316 186853 DEBUG nova.virt.libvirt.host [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.320 186853 DEBUG nova.virt.libvirt.host [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.321 186853 DEBUG nova.virt.libvirt.host [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.322 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.323 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.323 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.324 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.324 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.324 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.324 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.325 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.325 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.325 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.326 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.326 186853 DEBUG nova.virt.hardware [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.330 186853 DEBUG nova.virt.libvirt.vif [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-849339840',display_name='tempest-ServersTestJSON-server-849339840',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-849339840',id=140,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNeTAz8G+WtQEnDVQCIWuuae3miBxjOPUnKRHOt/l8W+q+TALWf4U9Y5FJxO6CFUtZ8WRe2/7bHoG1UTH6RY8u91pqpnnVgoXf5m2eSRE5boS4R0NeXH+VFSVo4SCT2B7w==',key_name='tempest-key-1426395444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q0lbn4yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:08Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.331 186853 DEBUG nova.network.os_vif_util [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.332 186853 DEBUG nova.network.os_vif_util [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.333 186853 DEBUG nova.objects.instance [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.343 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.344 186853 DEBUG nova.objects.instance [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'migration_context' on Instance uuid 9d3b7a77-8b28-4774-9eeb-65b858c3820b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.348 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <uuid>9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e</uuid>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <name>instance-0000008c</name>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersTestJSON-server-849339840</nova:name>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:14:15</nova:creationTime>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        <nova:port uuid="0867f0fe-7fa4-488f-8c3f-451ee6c50a10">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="serial">9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="uuid">9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.config"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:92:7c:a7"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <target dev="tap0867f0fe-7f"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/console.log" append="off"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:14:15 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:14:15 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:14:15 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:14:15 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.349 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Preparing to wait for external event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.350 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.350 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.351 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.351 186853 DEBUG nova.virt.libvirt.vif [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-849339840',display_name='tempest-ServersTestJSON-server-849339840',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-849339840',id=140,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNeTAz8G+WtQEnDVQCIWuuae3miBxjOPUnKRHOt/l8W+q+TALWf4U9Y5FJxO6CFUtZ8WRe2/7bHoG1UTH6RY8u91pqpnnVgoXf5m2eSRE5boS4R0NeXH+VFSVo4SCT2B7w==',key_name='tempest-key-1426395444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q0lbn4yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:08Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.352 186853 DEBUG nova.network.os_vif_util [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.353 186853 DEBUG nova.network.os_vif_util [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.353 186853 DEBUG os_vif [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.354 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.354 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.355 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.358 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.359 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0867f0fe-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.359 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0867f0fe-7f, col_values=(('external_ids', {'iface-id': '0867f0fe-7fa4-488f-8c3f-451ee6c50a10', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:7c:a7', 'vm-uuid': '9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.361 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.363 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:14:15 np0005531887 NetworkManager[55210]: <info>  [1763799255.3634] manager: (tap0867f0fe-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.372 186853 INFO os_vif [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f')#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.390 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.391 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Ensure instance console log exists: /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.391 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.392 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.392 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.462 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.462 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.463 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:92:7c:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:14:15 np0005531887 nova_compute[186849]: 2025-11-22 08:14:15.463 186853 INFO nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Using config drive#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.034 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.786 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.902 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799242.9010608, cd188127-6d18-4ee7-a764-30f3612157e8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.902 186853 INFO nova.compute.manager [-] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:14:17 np0005531887 nova_compute[186849]: 2025-11-22 08:14:17.921 186853 DEBUG nova.compute.manager [None req-2909e5aa-49cd-4a72-bbc5-996fa7c8e28d - - - - - -] [instance: cd188127-6d18-4ee7-a764-30f3612157e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:18 np0005531887 nova_compute[186849]: 2025-11-22 08:14:18.779 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:18 np0005531887 nova_compute[186849]: 2025-11-22 08:14:18.795 186853 INFO nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Creating config drive at /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.config#033[00m
Nov 22 03:14:18 np0005531887 nova_compute[186849]: 2025-11-22 08:14:18.800 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqo_6kk_h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:18 np0005531887 nova_compute[186849]: 2025-11-22 08:14:18.933 186853 DEBUG oslo_concurrency.processutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqo_6kk_h" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:19 np0005531887 kernel: tap0867f0fe-7f: entered promiscuous mode
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.0142] manager: (tap0867f0fe-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/205)
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.013 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:19Z|00453|binding|INFO|Claiming lport 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 for this chassis.
Nov 22 03:14:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:19Z|00454|binding|INFO|0867f0fe-7fa4-488f-8c3f-451ee6c50a10: Claiming fa:16:3e:92:7c:a7 10.100.0.14
Nov 22 03:14:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:19Z|00455|binding|INFO|Setting lport 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 ovn-installed in OVS
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.034 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.037 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 systemd-udevd[234200]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:14:19 np0005531887 systemd-machined[153180]: New machine qemu-50-instance-0000008c.
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.0661] device (tap0867f0fe-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.0673] device (tap0867f0fe-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:14:19 np0005531887 systemd[1]: Started Virtual Machine qemu-50-instance-0000008c.
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.480 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:7c:a7 10.100.0.14'], port_security=['fa:16:3e:92:7c:a7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0867f0fe-7fa4-488f-8c3f-451ee6c50a10) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.482 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:14:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:19Z|00456|binding|INFO|Setting lport 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 up in Southbound
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.484 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.500 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d2af9a35-dd07-4002-9868-25200590d7c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.501 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66c945b4-71 in ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.503 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66c945b4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.504 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f1b6fc-c56d-44fe-b411-78f6f989b29c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.505 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a9499ac9-2337-45cf-ac12-1e7cc4ced80d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.521 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ec91a191-05bf-4057-b0a6-f563356f9cda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.538 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d1854faf-89b1-4de0-b2f2-2ac68d123321]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.577 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[314b4464-c122-46d2-a9b1-5e1bec5628e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 systemd-udevd[234203]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.585 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ee806c52-20a3-4b0a-b1b8-6ec554eac24b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.5874] manager: (tap66c945b4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/206)
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.631 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a19fe1ee-5dc4-4756-838d-95161abeb599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.636 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6ce040-18b2-45e5-9aa6-5562781a928c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.6637] device (tap66c945b4-70): carrier: link connected
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.670 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[01a0b735-b152-4d88-bbc7-a9d6eeb8e102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.691 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[44c87397-4a3b-4f8d-847f-2618c0f38a50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600227, 'reachable_time': 39595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234240, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.709 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[10cd73df-dd7b-400d-a5a1-1d94c02dc3ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:5a27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600227, 'tstamp': 600227}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234242, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.731 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799259.7305205, 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.731 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] VM Started (Lifecycle Event)#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.732 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7e448347-fcdd-41cb-af43-4ba66145b32c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600227, 'reachable_time': 39595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234243, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.772 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0adfd8-ad23-4039-a0ec-8550f1d69588]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.794 186853 DEBUG nova.network.neutron [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Updated VIF entry in instance network info cache for port 0867f0fe-7fa4-488f-8c3f-451ee6c50a10. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.795 186853 DEBUG nova.network.neutron [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Updating instance_info_cache with network_info: [{"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.845 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa03da2-512d-4b3a-81cd-93b6642e2226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.847 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.848 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.848 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.848 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:19 np0005531887 NetworkManager[55210]: <info>  [1763799259.8510] manager: (tap66c945b4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Nov 22 03:14:19 np0005531887 kernel: tap66c945b4-70: entered promiscuous mode
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.852 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.855 186853 DEBUG oslo_concurrency.lockutils [req-4890d1f9-f516-436c-a16b-0205d1575bd4 req-8bbf6b2d-e8ff-45a8-842a-5f7edde01539 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.856 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:19Z|00457|binding|INFO|Releasing lport d6ef1392-aa2a-4e3e-91ba-ec0ce61e416a from this chassis (sb_readonly=0)
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.859 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.861 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.861 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799259.7319808, 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.862 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.862 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7c96cf28-2557-436f-852e-765c78c3877f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.864 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:14:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:19.865 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'env', 'PROCESS_TAG=haproxy-66c945b4-7237-4e85-b411-0c51b31ea31a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66c945b4-7237-4e85-b411-0c51b31ea31a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.870 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.878 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.882 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:14:19 np0005531887 nova_compute[186849]: 2025-11-22 08:14:19.904 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:14:20 np0005531887 podman[234274]: 2025-11-22 08:14:20.283296885 +0000 UTC m=+0.086366223 container create 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:14:20 np0005531887 podman[234274]: 2025-11-22 08:14:20.220779152 +0000 UTC m=+0.023848520 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:14:20 np0005531887 systemd[1]: Started libpod-conmon-40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5.scope.
Nov 22 03:14:20 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:14:20 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6b02c7a7267a0ef93bc748b34985473749ee0ecfe288363b4dd2f863bde53a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:14:20 np0005531887 nova_compute[186849]: 2025-11-22 08:14:20.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:20 np0005531887 podman[234274]: 2025-11-22 08:14:20.374750532 +0000 UTC m=+0.177819880 container init 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:14:20 np0005531887 podman[234274]: 2025-11-22 08:14:20.38155176 +0000 UTC m=+0.184621088 container start 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:14:20 np0005531887 podman[234287]: 2025-11-22 08:14:20.382687248 +0000 UTC m=+0.061952251 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:14:20 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [NOTICE]   (234311) : New worker (234313) forked
Nov 22 03:14:20 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [NOTICE]   (234311) : Loading success.
Nov 22 03:14:20 np0005531887 nova_compute[186849]: 2025-11-22 08:14:20.903 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Successfully updated port: 6fca0d10-ec3d-4b7b-844b-458a39db0a47 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:20 np0005531887 nova_compute[186849]: 2025-11-22 08:14:20.946 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:20 np0005531887 nova_compute[186849]: 2025-11-22 08:14:20.947 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquired lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:20 np0005531887 nova_compute[186849]: 2025-11-22 08:14:20.947 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.084 186853 DEBUG nova.compute.manager [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.084 186853 DEBUG nova.compute.manager [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing instance network info cache due to event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.085 186853 DEBUG oslo_concurrency.lockutils [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.717 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.795 186853 DEBUG nova.compute.manager [req-4dfba895-2611-4714-8167-ff45342c99c6 req-202b5bcb-2d29-4f8d-8095-8d714ee26717 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.795 186853 DEBUG oslo_concurrency.lockutils [req-4dfba895-2611-4714-8167-ff45342c99c6 req-202b5bcb-2d29-4f8d-8095-8d714ee26717 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.795 186853 DEBUG oslo_concurrency.lockutils [req-4dfba895-2611-4714-8167-ff45342c99c6 req-202b5bcb-2d29-4f8d-8095-8d714ee26717 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.796 186853 DEBUG oslo_concurrency.lockutils [req-4dfba895-2611-4714-8167-ff45342c99c6 req-202b5bcb-2d29-4f8d-8095-8d714ee26717 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.796 186853 DEBUG nova.compute.manager [req-4dfba895-2611-4714-8167-ff45342c99c6 req-202b5bcb-2d29-4f8d-8095-8d714ee26717 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Processing event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.796 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.801 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799261.8009512, 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.802 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.804 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.809 186853 INFO nova.virt.libvirt.driver [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Instance spawned successfully.#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.810 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.835 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.836 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.837 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.837 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.837 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.838 186853 DEBUG nova.virt.libvirt.driver [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.843 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.846 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:14:21 np0005531887 nova_compute[186849]: 2025-11-22 08:14:21.867 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:14:22 np0005531887 nova_compute[186849]: 2025-11-22 08:14:22.035 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.029 186853 INFO nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Took 14.06 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.030 186853 DEBUG nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.207 186853 INFO nova.compute.manager [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Took 14.79 seconds to build instance.#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.282 186853 DEBUG oslo_concurrency.lockutils [None req-d0b81d79-5fe4-4ad1-839b-b19b4e0269c2 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.927 186853 DEBUG nova.compute.manager [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.927 186853 DEBUG oslo_concurrency.lockutils [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.928 186853 DEBUG oslo_concurrency.lockutils [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.929 186853 DEBUG oslo_concurrency.lockutils [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.929 186853 DEBUG nova.compute.manager [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] No waiting events found dispatching network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:14:23 np0005531887 nova_compute[186849]: 2025-11-22 08:14:23.930 186853 WARNING nova.compute.manager [req-2163f7d1-9a27-4c17-8d8f-e67ac42ce685 req-a82142a2-2f88-4893-8aea-809261290776 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received unexpected event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.427 186853 DEBUG nova.network.neutron [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.452 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Releasing lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.452 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Instance network_info: |[{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.453 186853 DEBUG oslo_concurrency.lockutils [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.453 186853 DEBUG nova.network.neutron [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.456 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Start _get_guest_xml network_info=[{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='57bb7699361678fb09bc540599f03160',container_format='bare',created_at=2025-11-22T08:13:59Z,direct_url=<?>,disk_format='qcow2',id=2b9b9d31-f80f-437c-8142-755f74bb78ae,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-31724466',owner='5c9016c6b616412fa2db0983e23a8150',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T08:14:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.461 186853 WARNING nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.471 186853 DEBUG nova.virt.libvirt.host [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.472 186853 DEBUG nova.virt.libvirt.host [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.475 186853 DEBUG nova.virt.libvirt.host [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.476 186853 DEBUG nova.virt.libvirt.host [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.477 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.477 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='57bb7699361678fb09bc540599f03160',container_format='bare',created_at=2025-11-22T08:13:59Z,direct_url=<?>,disk_format='qcow2',id=2b9b9d31-f80f-437c-8142-755f74bb78ae,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-31724466',owner='5c9016c6b616412fa2db0983e23a8150',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-22T08:14:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.478 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.478 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.478 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.478 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.479 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.479 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.479 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.480 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.480 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.480 186853 DEBUG nova.virt.hardware [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.483 186853 DEBUG nova.virt.libvirt.vif [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-368734434',display_name='tempest-TestSnapshotPattern-server-368734434',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-368734434',id=141,image_ref='2b9b9d31-f80f-437c-8142-755f74bb78ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-9jtkwk6t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='64848f5c-64c9-41ed-9c0d-c2ef3839d5de',image_min_disk='1',image_min_ram='0',image_owner_id='5c9016c6b616412fa2db0983e23a8150',image_owner_project_name='tempest-TestSnapshotPattern-1254822391',image_owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member',image_user_id='72df4512d7f245118018df81223ce5ff',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:11Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',uuid=9d3b7a
77-8b28-4774-9eeb-65b858c3820b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.484 186853 DEBUG nova.network.os_vif_util [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.484 186853 DEBUG nova.network.os_vif_util [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.485 186853 DEBUG nova.objects.instance [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d3b7a77-8b28-4774-9eeb-65b858c3820b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.506 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <uuid>9d3b7a77-8b28-4774-9eeb-65b858c3820b</uuid>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <name>instance-0000008d</name>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestSnapshotPattern-server-368734434</nova:name>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:14:24</nova:creationTime>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:user uuid="72df4512d7f245118018df81223ce5ff">tempest-TestSnapshotPattern-1254822391-project-member</nova:user>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:project uuid="5c9016c6b616412fa2db0983e23a8150">tempest-TestSnapshotPattern-1254822391</nova:project>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="2b9b9d31-f80f-437c-8142-755f74bb78ae"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        <nova:port uuid="6fca0d10-ec3d-4b7b-844b-458a39db0a47">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="serial">9d3b7a77-8b28-4774-9eeb-65b858c3820b</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="uuid">9d3b7a77-8b28-4774-9eeb-65b858c3820b</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.config"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:90:00:cb"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <target dev="tap6fca0d10-ec"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/console.log" append="off"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <input type="keyboard" bus="usb"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:14:24 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:14:24 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:14:24 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:14:24 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.511 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Preparing to wait for external event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.511 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.512 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.512 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.513 186853 DEBUG nova.virt.libvirt.vif [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-368734434',display_name='tempest-TestSnapshotPattern-server-368734434',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-368734434',id=141,image_ref='2b9b9d31-f80f-437c-8142-755f74bb78ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-9jtkwk6t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='64848f5c-64c9-41ed-9c0d-c2ef3839d5de',image_min_disk='1',image_min_ram='0',image_owner_id='5c9016c6b616412fa2db0983e23a8150',image_owner_project_name='tempest-TestSnapshotPattern-1254822391',image_owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member',image_user_id='72df4512d7f245118018df81223ce5ff',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:11Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',u
uid=9d3b7a77-8b28-4774-9eeb-65b858c3820b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.513 186853 DEBUG nova.network.os_vif_util [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.514 186853 DEBUG nova.network.os_vif_util [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.514 186853 DEBUG os_vif [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.515 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.515 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.515 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.518 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.519 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6fca0d10-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.519 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6fca0d10-ec, col_values=(('external_ids', {'iface-id': '6fca0d10-ec3d-4b7b-844b-458a39db0a47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:00:cb', 'vm-uuid': '9d3b7a77-8b28-4774-9eeb-65b858c3820b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.520 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:24 np0005531887 NetworkManager[55210]: <info>  [1763799264.5216] manager: (tap6fca0d10-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.524 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.531 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.534 186853 INFO os_vif [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec')#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.626 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.627 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.628 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No VIF found with MAC fa:16:3e:90:00:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:14:24 np0005531887 nova_compute[186849]: 2025-11-22 08:14:24.628 186853 INFO nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Using config drive#033[00m
Nov 22 03:14:25 np0005531887 nova_compute[186849]: 2025-11-22 08:14:25.774 186853 INFO nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Creating config drive at /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.config#033[00m
Nov 22 03:14:25 np0005531887 nova_compute[186849]: 2025-11-22 08:14:25.780 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk2nwl13c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:25 np0005531887 podman[234325]: 2025-11-22 08:14:25.840275834 +0000 UTC m=+0.058293860 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 22 03:14:25 np0005531887 nova_compute[186849]: 2025-11-22 08:14:25.920 186853 DEBUG oslo_concurrency.processutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk2nwl13c" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:25 np0005531887 kernel: tap6fca0d10-ec: entered promiscuous mode
Nov 22 03:14:25 np0005531887 NetworkManager[55210]: <info>  [1763799265.9855] manager: (tap6fca0d10-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/209)
Nov 22 03:14:25 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:25Z|00458|binding|INFO|Claiming lport 6fca0d10-ec3d-4b7b-844b-458a39db0a47 for this chassis.
Nov 22 03:14:25 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:25Z|00459|binding|INFO|6fca0d10-ec3d-4b7b-844b-458a39db0a47: Claiming fa:16:3e:90:00:cb 10.100.0.5
Nov 22 03:14:25 np0005531887 nova_compute[186849]: 2025-11-22 08:14:25.993 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.0111] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.0118] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.021 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:00:cb 10.100.0.5'], port_security=['fa:16:3e:90:00:cb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9016c6b616412fa2db0983e23a8150', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7420c781-e9c7-4653-97a5-92e76e44aa71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e7964e9-a04c-4b66-8053-f482dcbb2cee, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=6fca0d10-ec3d-4b7b-844b-458a39db0a47) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.024 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 6fca0d10-ec3d-4b7b-844b-458a39db0a47 in datapath 5cbf5083-8d50-44bd-b6ba-93e507a8654e bound to our chassis#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.027 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5cbf5083-8d50-44bd-b6ba-93e507a8654e#033[00m
Nov 22 03:14:26 np0005531887 systemd-machined[153180]: New machine qemu-51-instance-0000008d.
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.040 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b66768-8bda-4987-a8bb-f3d12a979c6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.042 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5cbf5083-81 in ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.046 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5cbf5083-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.046 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d924efdf-256d-445e-b2ec-5f04fbaea7b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.047 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95036119-1ab3-4abf-91f2-377c2b8c120d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.058 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[611a6b9c-2f1a-48b5-9676-bae4ab7a7140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 systemd[1]: Started Virtual Machine qemu-51-instance-0000008d.
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.084 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[431eeec1-ce57-47f2-b7d5-e88b3bb3d383]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 systemd-udevd[234364]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.1074] device (tap6fca0d10-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.1087] device (tap6fca0d10-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.115 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3093cbd5-aad4-4e2a-bf8e-ec6aee4d8284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.1376] manager: (tap5cbf5083-80): new Veth device (/org/freedesktop/NetworkManager/Devices/212)
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.133 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f79df2b7-7e62-498b-a61f-aa3dece574bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 systemd-udevd[234368]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:26Z|00460|binding|INFO|Releasing lport d6ef1392-aa2a-4e3e-91ba-ec0ce61e416a from this chassis (sb_readonly=0)
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.173 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4c961ed9-17f6-499a-88cf-7a50a2d7cee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.177 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[065df596-be95-445c-8954-2782cf62eb50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:26Z|00461|binding|INFO|Setting lport 6fca0d10-ec3d-4b7b-844b-458a39db0a47 ovn-installed in OVS
Nov 22 03:14:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:26Z|00462|binding|INFO|Setting lport 6fca0d10-ec3d-4b7b-844b-458a39db0a47 up in Southbound
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.194 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.2011] device (tap5cbf5083-80): carrier: link connected
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.206 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[bde6b6e7-cb79-4037-9a8b-96051b029e14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.225 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73a1d1f8-0cb0-41c2-9a4a-78fce8a592a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5cbf5083-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:31:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600880, 'reachable_time': 44677, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234394, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.241 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[90a8b14b-0429-4649-8e4c-b73bcfc12cbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:3134'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600880, 'tstamp': 600880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234395, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.259 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3531da4a-f64a-4725-988e-64b45d0d59a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5cbf5083-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:31:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600880, 'reachable_time': 44677, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234396, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.294 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6017b8d2-2d1d-4fd3-bad4-10430f47ddcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.361 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e2569fe3-3d30-4c41-8170-3e5018ae56f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.363 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cbf5083-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.363 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.364 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5cbf5083-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.366 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 kernel: tap5cbf5083-80: entered promiscuous mode
Nov 22 03:14:26 np0005531887 NetworkManager[55210]: <info>  [1763799266.3671] manager: (tap5cbf5083-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.369 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.374 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5cbf5083-80, col_values=(('external_ids', {'iface-id': 'c7997624-ca02-4f3d-814b-acdd3ec0189c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.375 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:26Z|00463|binding|INFO|Releasing lport c7997624-ca02-4f3d-814b-acdd3ec0189c from this chassis (sb_readonly=0)
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.379 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.380 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9b869dcf-07bd-4b06-88bf-e5dce38a9d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.381 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-5cbf5083-8d50-44bd-b6ba-93e507a8654e
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 5cbf5083-8d50-44bd-b6ba-93e507a8654e
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:14:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:26.382 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'env', 'PROCESS_TAG=haproxy-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5cbf5083-8d50-44bd-b6ba-93e507a8654e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.389 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.548 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799266.5470574, 9d3b7a77-8b28-4774-9eeb-65b858c3820b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.549 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] VM Started (Lifecycle Event)#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.577 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.582 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799266.5487587, 9d3b7a77-8b28-4774-9eeb-65b858c3820b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.582 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.602 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.606 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.627 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.676 186853 DEBUG nova.compute.manager [req-4cb7e62c-e1f8-4a58-9f98-a6f2d7453319 req-4db4dd08-ec9e-46f4-b309-33f3b415ddf0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.677 186853 DEBUG oslo_concurrency.lockutils [req-4cb7e62c-e1f8-4a58-9f98-a6f2d7453319 req-4db4dd08-ec9e-46f4-b309-33f3b415ddf0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.678 186853 DEBUG oslo_concurrency.lockutils [req-4cb7e62c-e1f8-4a58-9f98-a6f2d7453319 req-4db4dd08-ec9e-46f4-b309-33f3b415ddf0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.678 186853 DEBUG oslo_concurrency.lockutils [req-4cb7e62c-e1f8-4a58-9f98-a6f2d7453319 req-4db4dd08-ec9e-46f4-b309-33f3b415ddf0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.678 186853 DEBUG nova.compute.manager [req-4cb7e62c-e1f8-4a58-9f98-a6f2d7453319 req-4db4dd08-ec9e-46f4-b309-33f3b415ddf0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Processing event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.679 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.683 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799266.682872, 9d3b7a77-8b28-4774-9eeb-65b858c3820b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.683 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.689 186853 DEBUG nova.virt.libvirt.driver [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.693 186853 INFO nova.virt.libvirt.driver [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Instance spawned successfully.#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.694 186853 INFO nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Took 15.60 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.694 186853 DEBUG nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.703 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.707 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.729 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:14:26 np0005531887 nova_compute[186849]: 2025-11-22 08:14:26.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:26 np0005531887 podman[234436]: 2025-11-22 08:14:26.763956368 +0000 UTC m=+0.024055844 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:14:26 np0005531887 podman[234436]: 2025-11-22 08:14:26.984008269 +0000 UTC m=+0.244107715 container create 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:14:27 np0005531887 systemd[1]: Started libpod-conmon-09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444.scope.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:27 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:14:27 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ce1f7cf373bd09991ed80acf4e7908bce79a67ef32b6719ad78666bf614964e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:14:27 np0005531887 podman[234436]: 2025-11-22 08:14:27.118753875 +0000 UTC m=+0.378853341 container init 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:14:27 np0005531887 podman[234436]: 2025-11-22 08:14:27.125215524 +0000 UTC m=+0.385314970 container start 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [NOTICE]   (234455) : New worker (234457) forked
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [NOTICE]   (234455) : Loading success.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.208 186853 INFO nova.compute.manager [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Took 16.73 seconds to build instance.#033[00m
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.257 186853 DEBUG oslo_concurrency.lockutils [None req-e897a614-fba5-432a-b6bb-00a7f1b2ba40 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.402 186853 DEBUG nova.network.neutron [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updated VIF entry in instance network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.404 186853 DEBUG nova.network.neutron [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.436 186853 DEBUG oslo_concurrency.lockutils [req-08af2d56-ad3d-461d-8572-8bb607117206 req-8dc2ef5e-9b46-4997-a91e-cf39442e985a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.601 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.602 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.602 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.603 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.603 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.611 186853 INFO nova.compute.manager [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Terminating instance
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.618 186853 DEBUG nova.compute.manager [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 03:14:27 np0005531887 kernel: tap0867f0fe-7f (unregistering): left promiscuous mode
Nov 22 03:14:27 np0005531887 NetworkManager[55210]: <info>  [1763799267.6401] device (tap0867f0fe-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:14:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:27Z|00464|binding|INFO|Releasing lport 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 from this chassis (sb_readonly=0)
Nov 22 03:14:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:27Z|00465|binding|INFO|Setting lport 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 down in Southbound
Nov 22 03:14:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:27Z|00466|binding|INFO|Removing iface tap0867f0fe-7f ovn-installed in OVS
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.649 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.651 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.666 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:27.673 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:7c:a7 10.100.0.14'], port_security=['fa:16:3e:92:7c:a7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=0867f0fe-7fa4-488f-8c3f-451ee6c50a10) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:14:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:27.675 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 0867f0fe-7fa4-488f-8c3f-451ee6c50a10 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis
Nov 22 03:14:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:27.677 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66c945b4-7237-4e85-b411-0c51b31ea31a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 03:14:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:27.679 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b9cfb7ff-71f8-426c-b8dd-7d842d9effbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:27.679 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace which is not needed anymore
Nov 22 03:14:27 np0005531887 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Nov 22 03:14:27 np0005531887 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000008c.scope: Consumed 6.488s CPU time.
Nov 22 03:14:27 np0005531887 systemd-machined[153180]: Machine qemu-50-instance-0000008c terminated.
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [NOTICE]   (234311) : haproxy version is 2.8.14-c23fe91
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [NOTICE]   (234311) : path to executable is /usr/sbin/haproxy
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [WARNING]  (234311) : Exiting Master process...
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [WARNING]  (234311) : Exiting Master process...
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [ALERT]    (234311) : Current worker (234313) exited with code 143 (Terminated)
Nov 22 03:14:27 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[234295]: [WARNING]  (234311) : All workers exited. Exiting... (0)
Nov 22 03:14:27 np0005531887 systemd[1]: libpod-40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5.scope: Deactivated successfully.
Nov 22 03:14:27 np0005531887 podman[234484]: 2025-11-22 08:14:27.818927064 +0000 UTC m=+0.049472663 container died 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.889 186853 INFO nova.virt.libvirt.driver [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Instance destroyed successfully.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.889 186853 DEBUG nova.objects.instance [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:14:27 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5-userdata-shm.mount: Deactivated successfully.
Nov 22 03:14:27 np0005531887 systemd[1]: var-lib-containers-storage-overlay-c6b02c7a7267a0ef93bc748b34985473749ee0ecfe288363b4dd2f863bde53a6-merged.mount: Deactivated successfully.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.904 186853 DEBUG nova.virt.libvirt.vif [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-849339840',display_name='tempest-ServersTestJSON-server-849339840',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-849339840',id=140,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNeTAz8G+WtQEnDVQCIWuuae3miBxjOPUnKRHOt/l8W+q+TALWf4U9Y5FJxO6CFUtZ8WRe2/7bHoG1UTH6RY8u91pqpnnVgoXf5m2eSRE5boS4R0NeXH+VFSVo4SCT2B7w==',key_name='tempest-key-1426395444',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:14:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-q0lbn4yi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:14:23Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.904 186853 DEBUG nova.network.os_vif_util [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "address": "fa:16:3e:92:7c:a7", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0867f0fe-7f", "ovs_interfaceid": "0867f0fe-7fa4-488f-8c3f-451ee6c50a10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.905 186853 DEBUG nova.network.os_vif_util [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.905 186853 DEBUG os_vif [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.908 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.908 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0867f0fe-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.910 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.913 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.915 186853 INFO os_vif [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:7c:a7,bridge_name='br-int',has_traffic_filtering=True,id=0867f0fe-7fa4-488f-8c3f-451ee6c50a10,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0867f0fe-7f')
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.915 186853 INFO nova.virt.libvirt.driver [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Deleting instance files /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e_del
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.916 186853 INFO nova.virt.libvirt.driver [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Deletion of /var/lib/nova/instances/9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e_del complete
Nov 22 03:14:27 np0005531887 podman[234484]: 2025-11-22 08:14:27.931632475 +0000 UTC m=+0.162178024 container cleanup 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:14:27 np0005531887 systemd[1]: libpod-conmon-40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5.scope: Deactivated successfully.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.995 186853 INFO nova.compute.manager [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.995 186853 DEBUG oslo.service.loopingcall [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.996 186853 DEBUG nova.compute.manager [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 03:14:27 np0005531887 nova_compute[186849]: 2025-11-22 08:14:27.996 186853 DEBUG nova.network.neutron [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 03:14:28 np0005531887 podman[234529]: 2025-11-22 08:14:28.017498945 +0000 UTC m=+0.061797477 container remove 40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.023 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[10abde9a-be26-43d7-968a-5555af3aec42]: (4, ('Sat Nov 22 08:14:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5)\n40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5\nSat Nov 22 08:14:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5)\n40d9328b60e4b899762c69b9ded38e8c5b8196a95f45d5baca3ec402c5ac80d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.024 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[42022627-92c4-42cf-8998-52ddb9427a38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.025 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:14:28 np0005531887 kernel: tap66c945b4-70: left promiscuous mode
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.034 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[91f48180-d92d-44cb-85ca-bf09fdbee35f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.040 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.049 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b39a0e07-7260-4c19-9913-1eddb7351499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.051 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[59adf031-3482-4288-b4a3-42603b290867]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.067 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e114b328-8ea0-48f7-a9e4-e982129d296f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600217, 'reachable_time': 43920, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234544, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 systemd[1]: run-netns-ovnmeta\x2d66c945b4\x2d7237\x2d4e85\x2db411\x2d0c51b31ea31a.mount: Deactivated successfully.
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.072 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 03:14:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:28.073 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[6d83b943-09eb-4605-94f6-8798626faffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.882 186853 DEBUG nova.compute.manager [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.883 186853 DEBUG oslo_concurrency.lockutils [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.883 186853 DEBUG oslo_concurrency.lockutils [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.883 186853 DEBUG oslo_concurrency.lockutils [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.883 186853 DEBUG nova.compute.manager [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] No waiting events found dispatching network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:14:28 np0005531887 nova_compute[186849]: 2025-11-22 08:14:28.883 186853 WARNING nova.compute.manager [req-d91db04a-773e-4fd8-a418-46c2046fc6e9 req-e4ec367e-50a1-49c8-8597-effd1b2ecdfb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received unexpected event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 for instance with vm_state active and task_state None.
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.944 186853 DEBUG nova.compute.manager [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-unplugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.945 186853 DEBUG oslo_concurrency.lockutils [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.945 186853 DEBUG oslo_concurrency.lockutils [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.945 186853 DEBUG oslo_concurrency.lockutils [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.946 186853 DEBUG nova.compute.manager [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] No waiting events found dispatching network-vif-unplugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:14:29 np0005531887 nova_compute[186849]: 2025-11-22 08:14:29.946 186853 DEBUG nova.compute.manager [req-132ef5e6-c639-4d7f-80eb-f7524d238eeb req-5ab42d02-2725-4247-8712-59b09bc1688d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-unplugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.691 186853 DEBUG nova.network.neutron [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.721 186853 INFO nova.compute.manager [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Took 2.73 seconds to deallocate network for instance.#033[00m
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.793 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.794 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:30 np0005531887 podman[234545]: 2025-11-22 08:14:30.844566723 +0000 UTC m=+0.062868773 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.919 186853 DEBUG nova.compute.provider_tree [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.957 186853 DEBUG nova.scheduler.client.report [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:30 np0005531887 nova_compute[186849]: 2025-11-22 08:14:30.981 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:31 np0005531887 nova_compute[186849]: 2025-11-22 08:14:31.018 186853 INFO nova.scheduler.client.report [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e#033[00m
Nov 22 03:14:31 np0005531887 nova_compute[186849]: 2025-11-22 08:14:31.160 186853 DEBUG oslo_concurrency.lockutils [None req-5c7e33b6-08e0-438d-b6e7-e46734edbafb 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:31 np0005531887 nova_compute[186849]: 2025-11-22 08:14:31.304 186853 DEBUG nova.compute.manager [req-c6d397b4-0b0c-4f8b-939e-a8f084177109 req-46c28244-7430-4f72-bd4f-207c0ff2be51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-deleted-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.041 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.081 186853 DEBUG nova.compute.manager [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.081 186853 DEBUG oslo_concurrency.lockutils [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.082 186853 DEBUG oslo_concurrency.lockutils [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.083 186853 DEBUG oslo_concurrency.lockutils [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.083 186853 DEBUG nova.compute.manager [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] No waiting events found dispatching network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.084 186853 WARNING nova.compute.manager [req-ccb968ac-89f8-475c-83da-9cb59118a82c req-6325967a-7b99-4cde-834e-de9f5e9460f4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Received unexpected event network-vif-plugged-0867f0fe-7fa4-488f-8c3f-451ee6c50a10 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:14:32 np0005531887 nova_compute[186849]: 2025-11-22 08:14:32.910 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:33 np0005531887 nova_compute[186849]: 2025-11-22 08:14:33.434 186853 DEBUG nova.compute.manager [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:33 np0005531887 nova_compute[186849]: 2025-11-22 08:14:33.434 186853 DEBUG nova.compute.manager [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing instance network info cache due to event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:33 np0005531887 nova_compute[186849]: 2025-11-22 08:14:33.434 186853 DEBUG oslo_concurrency.lockutils [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:33 np0005531887 nova_compute[186849]: 2025-11-22 08:14:33.434 186853 DEBUG oslo_concurrency.lockutils [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:33 np0005531887 nova_compute[186849]: 2025-11-22 08:14:33.435 186853 DEBUG nova.network.neutron [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:35 np0005531887 nova_compute[186849]: 2025-11-22 08:14:35.666 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:14:35 np0005531887 nova_compute[186849]: 2025-11-22 08:14:35.722 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Triggering sync for uuid 9d3b7a77-8b28-4774-9eeb-65b858c3820b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:14:35 np0005531887 nova_compute[186849]: 2025-11-22 08:14:35.723 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:35 np0005531887 nova_compute[186849]: 2025-11-22 08:14:35.723 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:35 np0005531887 nova_compute[186849]: 2025-11-22 08:14:35.890 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:36 np0005531887 nova_compute[186849]: 2025-11-22 08:14:36.025 186853 DEBUG nova.network.neutron [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updated VIF entry in instance network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:14:36 np0005531887 nova_compute[186849]: 2025-11-22 08:14:36.026 186853 DEBUG nova.network.neutron [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:36 np0005531887 nova_compute[186849]: 2025-11-22 08:14:36.072 186853 DEBUG oslo_concurrency.lockutils [req-299ad899-0d61-45a5-b14d-79d0573fd9cd req-c1c4b3af-8c45-4601-a666-f841349cfcd2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'name': 'tempest-TestSnapshotPattern-server-368734434', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5c9016c6b616412fa2db0983e23a8150', 'user_id': '72df4512d7f245118018df81223ce5ff', 'hostId': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.681 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.682 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '506cf9e1-52f8-4903-9093-d8c80f27e083', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.670394', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48ec1742-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': '1a235bccb722540d1269f01ec6b991f99100a59effa799be9aedec75a8024e91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.670394', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48ec2674-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': 'b0bfc1314bbeba67cce5a9b093a2a27cdfc43c170398601a03f1da512f221276'}]}, 'timestamp': '2025-11-22 08:14:36.682763', '_unique_id': 'b195bac7577740fd9954637470d74a9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.706 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.706 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9d3b7a77-8b28-4774-9eeb-65b858c3820b: ceilometer.compute.pollsters.NoVolumeException
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.707 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.707 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8fcaa37-10b8-4226-b1e7-1c89d7be8a62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.707010', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48efe69c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': 'c7222789975e88f28f58b0677be4e40c8fa270a6f321fb50d38a9e83d5c1d9ad'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 
'9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.707010', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48eff16e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': '6a6008161666fb79f06b18c1b8b0828dcd32a762010bce28360ae464f4578f28'}]}, 'timestamp': '2025-11-22 08:14:36.707574', '_unique_id': '0a3b03ffffce4e4db88f114581e1ab41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.735 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.736 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '566ba115-644e-4c9d-88f4-1335f6039a6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.709158', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f45b0a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': 'f666ac56a75f79ef3ac993cf2235f15f67069e299d566afe0364268b116ad92d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 
'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.709158', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f4696a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '55e136e5ee135d15608e275b8e651ab5b981f128d59460b5303561fb337c7806'}]}, 'timestamp': '2025-11-22 08:14:36.736909', '_unique_id': 'b0bc963fd2834cff8ec99305bab11fdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.739 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.739 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>]
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.739 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.739 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b6d64a9-67d0-4117-aa61-92c855665556', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.739496', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f4daee-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '0180201ca5fc50dac26e1d563fef0da806e1c02aa56f6782763b742eac25dafd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.739496', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f4e71e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': 'c7dd51e281e00a35aaac726c3281fdca6291d4ac1b7853635c6c53b7e8780ea8'}]}, 'timestamp': '2025-11-22 08:14:36.740109', '_unique_id': 'aa1b0bfaee3142df993e4695b5fe2b9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.741 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.741 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>]
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.745 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9d3b7a77-8b28-4774-9eeb-65b858c3820b / tap6fca0d10-ec inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.745 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18d44782-b22b-4baa-bc80-6d4740587fd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.742006', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f5bd88-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': 'd8058ec3ba09d253b9776021920a5e67a94ced3457001409fd487673306438f6'}]}, 'timestamp': '2025-11-22 08:14:36.745617', '_unique_id': '0ad60b21e2ba46d8a909807b0d62dac1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.747 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca0fc54c-1318-440e-8402-10d34248980d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.747112', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f604dc-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '431051ea7a21534b0c88ba1b19948850d1a73819891b1168d285395a913b09cc'}]}, 'timestamp': '2025-11-22 08:14:36.747433', '_unique_id': '19f93eada96e415e9c08ba8de656b22c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.748 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262b22c7-00ef-4540-9b70-8bab13907ee5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.748861', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f64870-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': 'f652dd56f3c9a3da288eb68561657e25baf5b90debc1e91c28b5382848589d36'}]}, 'timestamp': '2025-11-22 08:14:36.749164', '_unique_id': 'd0bcf5dd43e84fb597b020a2f76544df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.750 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.750 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80201fb5-b478-4830-bf47-4a8ea0247bd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.750582', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f68baa-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '8e0da7bb6f4f50b2473b075b1789ead53e029861044fb7419f97fa68b38b4200'}]}, 'timestamp': '2025-11-22 08:14:36.750885', '_unique_id': '26734d01098e48759deceb68bc9ddb03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.751 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.752 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94f449a6-7db5-46ba-a6d8-5497f2c98939', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.752312', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f6cf66-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': 'b2bcc848e88cfc25d7125095851e3df518ab8e8ed4a3f8f7f464d352159c6577'}]}, 'timestamp': '2025-11-22 08:14:36.752621', '_unique_id': 'ba6c9f0daa844ca09ae830cccde66523'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fcc093-bdcf-418b-b5b3-e3cba3154619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.754015', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f711f6-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': 'a554efef9fd31f9b5d5397e4ece54a7ffddf95fa6cce7e5aa53357d9596fc5e7'}]}, 'timestamp': '2025-11-22 08:14:36.754351', '_unique_id': '06c43420793a46c3a47ec03aa85939d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.755 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbb08f29-d6a4-4430-a7af-6e75cb7f6a46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.755737', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f755ee-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '16ffc6d465925e2ec1c890b67cf7d7b7dba60b85e6b2734406514bcf7fe83357'}]}, 'timestamp': '2025-11-22 08:14:36.756057', '_unique_id': '723fd9e957764404901a216b2d45b149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.757 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.757 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24c47ee5-880a-410f-a280-39bdec327e24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.757609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f79df6-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '46b6fd34fe8ca5ac0044d658c042d93b7be6b74677a86c1126e834e002b3f91d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.757609', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f7a882-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': 'a2bfef5740bf5078fb6bc3eb8aa4524900b920fc35c0396265d4d0c30c78430f'}]}, 'timestamp': '2025-11-22 08:14:36.758160', '_unique_id': '9edf83b5871b42fc863dd812b9a769a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.758 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.759 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.759 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>]
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.759 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f418a9e-ef0f-4794-8a50-3cdf7d6d0725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.759874', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f7f7ce-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '6d5565b9c716a62c127f405c823e80500d7518a7875d110028fb482477dd5654'}]}, 'timestamp': '2025-11-22 08:14:36.760197', '_unique_id': 'f2905d0003e14424a1647e93eb68714b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.760 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '161fcd1e-cf06-4df4-b640-b1f107fa78a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.761326', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f82e38-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '4cc71bae66c2085994f034831d2fe4f73f3ceb5b16ef996f321e6683bfd0451c'}]}, 'timestamp': '2025-11-22 08:14:36.761562', '_unique_id': 'bbbd3cdbb1bb4180a5d8658895536ce7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.762 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8ef2ba1-8896-42ce-a3d3-96e841a8fa6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-0000008d-9d3b7a77-8b28-4774-9eeb-65b858c3820b-tap6fca0d10-ec', 'timestamp': '2025-11-22T08:14:36.762604', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'tap6fca0d10-ec', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:00:cb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6fca0d10-ec'}, 'message_id': '48f85f98-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.411833932, 'message_signature': '94ca13355c17a094d69a1d8ff319fbcdaecf755d18f1c2f4ac58ad81181228f1'}]}, 'timestamp': '2025-11-22 08:14:36.762823', '_unique_id': '9733a966cac744e89414e3c64993b3d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.763 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/cpu volume: 9690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f002f7bb-7474-42e2-a35a-2088b40e0a67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9690000000, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'timestamp': '2025-11-22T08:14:36.763853', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '48f8917a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.376335635, 'message_signature': '027506897191a8214399bb52720680663554ee961abd9aad4331257c5f053089'}]}, 'timestamp': '2025-11-22 08:14:36.764108', '_unique_id': 'a7d9c265ba494b459916fb355e2a9e33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.764 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.765 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.765 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf053e00-1b48-4726-b23a-1fb2597f468e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.765165', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f8c5a0-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '5e9982860cf999c0e96f7792cb68b460c8f3125c3cbf816d9ebce3e81ef2ffe7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.765165', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f8cdf2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '222da6f79dcb05fe0c7eeea751be1747cf17b54c0a6aca38a9baf9d9fc983a8b'}]}, 'timestamp': '2025-11-22 08:14:36.765635', '_unique_id': 'fff1ec61c2764bf6a26a609c5b102781'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.766 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '436353d9-b622-4687-85e5-78d10db4aaf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.766702', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f8ffb6-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': 'db3006c885fe0b7b4733b8d10fdca58ba6a6eb470228a5f54cf743a372c49a86'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.766702', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f9072c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.340191034, 'message_signature': '3685686b9791de9f24dc320fd8ad04834e22e371ecf2dcd4b77165980e165c69'}]}, 'timestamp': '2025-11-22 08:14:36.767097', '_unique_id': 'df2329d1adbe4e6592b370907836a503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.767 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.768 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.latency volume: 986621467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.768 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.latency volume: 9447894 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1c7f96f-fd1e-414e-8a09-1c5545221bf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 986621467, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.768168', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f9399a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': 'c7b6c2027303db940c81dad4d9cc8595a4be99d54b5627b872522ef10966fdff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9447894, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 
'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.768168', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f9426e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '719ac1eaddad6ec30195a5cc1be4888ae9446a20ad3902acee777a4ef9f6d14e'}]}, 'timestamp': '2025-11-22 08:14:36.768617', '_unique_id': 'a18e24fdba6f4b9fb96a9f51a99bd709'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-368734434>]
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.769 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 DEBUG ceilometer.compute.pollsters [-] 9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ce48020-40c2-4620-85df-6d709258c017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-vda', 'timestamp': '2025-11-22T08:14:36.769966', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f97f0e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '1236cfd1734539ce65b6709906c763c6d4020cb92b72d9a66368670c202200a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 
'resource_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b-sda', 'timestamp': '2025-11-22T08:14:36.769966', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-368734434', 'name': 'instance-0000008d', 'instance_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'instance_type': 'm1.nano', 'host': '2f2fa492e6e3c7e1a189afc24ada02ef76d028b72e142de1557a5fbb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b9b9d31-f80f-437c-8142-755f74bb78ae'}, 'image_ref': '2b9b9d31-f80f-437c-8142-755f74bb78ae', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f9880a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6019.3789491, 'message_signature': '529fc008822f64952cb0d686b23138571ffafd4c523aaca3739c3c1b84bed64f'}]}, 'timestamp': '2025-11-22 08:14:36.770410', '_unique_id': '7b6b180a7ee64d698dd72b5c2f389675'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:14:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:14:37 np0005531887 nova_compute[186849]: 2025-11-22 08:14:37.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:37.349 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:37.349 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:37.350 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:37 np0005531887 podman[234567]: 2025-11-22 08:14:37.839234181 +0000 UTC m=+0.061619812 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:14:37 np0005531887 nova_compute[186849]: 2025-11-22 08:14:37.913 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:39 np0005531887 podman[234597]: 2025-11-22 08:14:39.83077918 +0000 UTC m=+0.054195449 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:14:39 np0005531887 podman[234598]: 2025-11-22 08:14:39.85510119 +0000 UTC m=+0.076714645 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:14:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:41Z|00053|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.5
Nov 22 03:14:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:41Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:90:00:cb 10.100.0.5
Nov 22 03:14:42 np0005531887 nova_compute[186849]: 2025-11-22 08:14:42.046 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:42 np0005531887 nova_compute[186849]: 2025-11-22 08:14:42.888 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799267.887049, 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:14:42 np0005531887 nova_compute[186849]: 2025-11-22 08:14:42.888 186853 INFO nova.compute.manager [-] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:14:42 np0005531887 nova_compute[186849]: 2025-11-22 08:14:42.914 186853 DEBUG nova.compute.manager [None req-22525ec5-48aa-4cfd-b362-2ece196dd24e - - - - - -] [instance: 9b56fea8-e3a5-4d2b-8dc2-fe0934916f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:14:42 np0005531887 nova_compute[186849]: 2025-11-22 08:14:42.917 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:44 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:44Z|00055|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.4 does not match offer 10.100.0.5
Nov 22 03:14:44 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:44Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:90:00:cb 10.100.0.5
Nov 22 03:14:45 np0005531887 podman[234644]: 2025-11-22 08:14:45.845783612 +0000 UTC m=+0.055095321 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:14:46 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:46Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:90:00:cb 10.100.0.5
Nov 22 03:14:46 np0005531887 ovn_controller[95130]: 2025-11-22T08:14:46Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:90:00:cb 10.100.0.5
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.046 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.699 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.699 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.746 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.892 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.893 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.911 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.912 186853 INFO nova.compute.claims [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:14:47 np0005531887 nova_compute[186849]: 2025-11-22 08:14:47.919 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.305 186853 DEBUG nova.compute.provider_tree [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.330 186853 DEBUG nova.scheduler.client.report [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.416 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.418 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.507 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.508 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:14:48 np0005531887 nova_compute[186849]: 2025-11-22 08:14:48.933 186853 INFO nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.305 186853 DEBUG nova.policy [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.489 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.900 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.902 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.902 186853 INFO nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Creating image(s)#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.903 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.904 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.904 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.926 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.990 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.991 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:49 np0005531887 nova_compute[186849]: 2025-11-22 08:14:49.992 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.004 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.085 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.086 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.226 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk 1073741824" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.227 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.228 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.303 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.304 186853 DEBUG nova.virt.disk.api [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.305 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.364 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.366 186853 DEBUG nova.virt.disk.api [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.366 186853 DEBUG nova.objects.instance [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 11b63192-42c9-4462-80c0-d66b0f6fcd47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.384 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.385 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Ensure instance console log exists: /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.386 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.386 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.386 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:14:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:50.802 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:14:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:50.803 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:14:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:14:50.803 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.804 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:50 np0005531887 podman[234685]: 2025-11-22 08:14:50.87316375 +0000 UTC m=+0.076430196 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:14:50 np0005531887 nova_compute[186849]: 2025-11-22 08:14:50.979 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Successfully created port: d9732384-f751-4c79-b2d8-54d8d0a67924 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:52 np0005531887 nova_compute[186849]: 2025-11-22 08:14:52.048 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:52 np0005531887 nova_compute[186849]: 2025-11-22 08:14:52.710 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Successfully created port: 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:14:52 np0005531887 nova_compute[186849]: 2025-11-22 08:14:52.922 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:54 np0005531887 nova_compute[186849]: 2025-11-22 08:14:54.968 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Successfully updated port: d9732384-f751-4c79-b2d8-54d8d0a67924 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.093 186853 DEBUG nova.compute.manager [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.094 186853 DEBUG nova.compute.manager [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing instance network info cache due to event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.094 186853 DEBUG oslo_concurrency.lockutils [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.094 186853 DEBUG oslo_concurrency.lockutils [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.094 186853 DEBUG nova.network.neutron [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing network info cache for port d9732384-f751-4c79-b2d8-54d8d0a67924 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:14:55 np0005531887 nova_compute[186849]: 2025-11-22 08:14:55.497 186853 DEBUG nova.network.neutron [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:56 np0005531887 nova_compute[186849]: 2025-11-22 08:14:56.077 186853 DEBUG nova.network.neutron [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:14:56 np0005531887 nova_compute[186849]: 2025-11-22 08:14:56.107 186853 DEBUG oslo_concurrency.lockutils [req-72dbb9e9-e909-4a6d-99e6-64b4ecc682c6 req-8de92916-9b14-4d2b-8188-1a8d6ddfb88a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:14:56 np0005531887 podman[234702]: 2025-11-22 08:14:56.834979729 +0000 UTC m=+0.050825045 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.044 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Successfully updated port: 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.050 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.059 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.059 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.059 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.205 186853 DEBUG nova.compute.manager [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-changed-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.205 186853 DEBUG nova.compute.manager [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing instance network info cache due to event network-changed-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.205 186853 DEBUG oslo_concurrency.lockutils [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.280 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:14:57 np0005531887 nova_compute[186849]: 2025-11-22 08:14:57.924 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:14:58 np0005531887 nova_compute[186849]: 2025-11-22 08:14:58.827 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.419 186853 DEBUG nova.network.neutron [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.454 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.455 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance network_info: |[{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.455 186853 DEBUG oslo_concurrency.lockutils [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.456 186853 DEBUG nova.network.neutron [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing network info cache for port 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.460 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Start _get_guest_xml network_info=[{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.467 186853 WARNING nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.476 186853 DEBUG nova.virt.libvirt.host [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.477 186853 DEBUG nova.virt.libvirt.host [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.482 186853 DEBUG nova.virt.libvirt.host [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.483 186853 DEBUG nova.virt.libvirt.host [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.484 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.484 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.485 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.485 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.485 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.486 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.486 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.486 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.487 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.487 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.487 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.488 186853 DEBUG nova.virt.hardware [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.491 186853 DEBUG nova.virt.libvirt.vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.492 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.493 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.494 186853 DEBUG nova.virt.libvirt.vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.494 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.495 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.496 186853 DEBUG nova.objects.instance [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 11b63192-42c9-4462-80c0-d66b0f6fcd47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.516 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <uuid>11b63192-42c9-4462-80c0-d66b0f6fcd47</uuid>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <name>instance-0000008f</name>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-540966288</nova:name>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:15:00</nova:creationTime>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:port uuid="d9732384-f751-4c79-b2d8-54d8d0a67924">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        <nova:port uuid="77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe14:9a3c" ipVersion="6"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe14:9a3c" ipVersion="6"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="serial">11b63192-42c9-4462-80c0-d66b0f6fcd47</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="uuid">11b63192-42c9-4462-80c0-d66b0f6fcd47</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.config"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ba:aa:4b"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <target dev="tapd9732384-f7"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:14:9a:3c"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <target dev="tap77caeab6-ac"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/console.log" append="off"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:15:00 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:15:00 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:15:00 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:15:00 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.518 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Preparing to wait for external event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.518 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.518 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.519 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.519 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Preparing to wait for external event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.519 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.520 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.520 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.520 186853 DEBUG nova.virt.libvirt.vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.521 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.521 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.522 186853 DEBUG os_vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.522 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.523 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.523 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.526 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.526 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9732384-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.527 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd9732384-f7, col_values=(('external_ids', {'iface-id': 'd9732384-f751-4c79-b2d8-54d8d0a67924', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:aa:4b', 'vm-uuid': '11b63192-42c9-4462-80c0-d66b0f6fcd47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 NetworkManager[55210]: <info>  [1763799300.5301] manager: (tapd9732384-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.531 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.541 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.542 186853 INFO os_vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7')#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.544 186853 DEBUG nova.virt.libvirt.vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:49Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.544 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.545 186853 DEBUG nova.network.os_vif_util [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.546 186853 DEBUG os_vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.546 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.547 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.550 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77caeab6-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.550 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77caeab6-ac, col_values=(('external_ids', {'iface-id': '77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:9a:3c', 'vm-uuid': '11b63192-42c9-4462-80c0-d66b0f6fcd47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 NetworkManager[55210]: <info>  [1763799300.5535] manager: (tap77caeab6-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.555 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.562 186853 INFO os_vif [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac')#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.776 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.776 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.776 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:ba:aa:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.777 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:14:9a:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:15:00 np0005531887 nova_compute[186849]: 2025-11-22 08:15:00.777 186853 INFO nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Using config drive#033[00m
Nov 22 03:15:01 np0005531887 podman[234726]: 2025-11-22 08:15:01.849607194 +0000 UTC m=+0.064720228 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:15:02 np0005531887 nova_compute[186849]: 2025-11-22 08:15:02.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.777 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.778 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.778 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.815 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.816 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.817 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.817 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.914 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.987 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:04 np0005531887 nova_compute[186849]: 2025-11-22 08:15:04.989 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.056 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.063 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.146 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.148 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.219 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.220 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000008f, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.config'#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.378 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.380 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5557MB free_disk=73.27180480957031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.380 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.380 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.391 186853 INFO nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Creating config drive at /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.config#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.395 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7wnuc5l8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.517 186853 DEBUG oslo_concurrency.processutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7wnuc5l8" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.523 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 9d3b7a77-8b28-4774-9eeb-65b858c3820b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.524 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 11b63192-42c9-4462-80c0-d66b0f6fcd47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.524 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.524 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 kernel: tapd9732384-f7: entered promiscuous mode
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.5800] manager: (tapd9732384-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00467|binding|INFO|Claiming lport d9732384-f751-4c79-b2d8-54d8d0a67924 for this chassis.
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00468|binding|INFO|d9732384-f751-4c79-b2d8-54d8d0a67924: Claiming fa:16:3e:ba:aa:4b 10.100.0.9
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.585 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.6013] manager: (tap77caeab6-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.599 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:aa:4b 10.100.0.9'], port_security=['fa:16:3e:ba:aa:4b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '11b63192-42c9-4462-80c0-d66b0f6fcd47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e573664-04ba-4ce5-994a-9fb9483a2400', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da796c20-96a3-420c-a9ae-3320426db7c7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d9732384-f751-4c79-b2d8-54d8d0a67924) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.601 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d9732384-f751-4c79-b2d8-54d8d0a67924 in datapath 7e573664-04ba-4ce5-994a-9fb9483a2400 bound to our chassis#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.602 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e573664-04ba-4ce5-994a-9fb9483a2400#033[00m
Nov 22 03:15:05 np0005531887 kernel: tap77caeab6-ac: entered promiscuous mode
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00469|binding|INFO|Setting lport d9732384-f751-4c79-b2d8-54d8d0a67924 ovn-installed in OVS
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00470|binding|INFO|Setting lport d9732384-f751-4c79-b2d8-54d8d0a67924 up in Southbound
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00471|if_status|INFO|Dropped 8 log messages in last 212 seconds (most recently, 212 seconds ago) due to excessive rate
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00472|if_status|INFO|Not updating pb chassis for 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 now as sb is readonly
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.611 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 systemd-udevd[234783]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:15:05 np0005531887 systemd-udevd[234782]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00473|binding|INFO|Claiming lport 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 for this chassis.
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00474|binding|INFO|77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9: Claiming fa:16:3e:14:9a:3c 2001:db8:0:1:f816:3eff:fe14:9a3c 2001:db8::f816:3eff:fe14:9a3c
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.621 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e033806e-693a-46a2-995b-2e6eff0132c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.623 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e573664-01 in ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.623 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00475|binding|INFO|Setting lport 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 ovn-installed in OVS
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.627 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.6289] device (tapd9732384-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.626 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e573664-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.626 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f71fa60b-b69f-417b-91c4-df1fa440dbd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.6297] device (tap77caeab6-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.6304] device (tapd9732384-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.6309] device (tap77caeab6-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.631 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:9a:3c 2001:db8:0:1:f816:3eff:fe14:9a3c 2001:db8::f816:3eff:fe14:9a3c'], port_security=['fa:16:3e:14:9a:3c 2001:db8:0:1:f816:3eff:fe14:9a3c 2001:db8::f816:3eff:fe14:9a3c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe14:9a3c/64 2001:db8::f816:3eff:fe14:9a3c/64', 'neutron:device_id': '11b63192-42c9-4462-80c0-d66b0f6fcd47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00476|binding|INFO|Setting lport 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 up in Southbound
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.631 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8c4412-1c45-4f3d-806c-81f7bf7c60f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.648 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[023750bd-c6cc-442c-9611-25fb291377ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 systemd-machined[153180]: New machine qemu-52-instance-0000008f.
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.663 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb49da0-a0f3-40da-8caf-4a7d43eac56d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 systemd[1]: Started Virtual Machine qemu-52-instance-0000008f.
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.687 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.694 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8e47558b-577d-4d90-96c9-85629430fa5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.699 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fff665aa-14d8-4dfc-9b3d-f8f519dc8270]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.7006] manager: (tap7e573664-00): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.733 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4c513185-2b0d-4cec-9cfc-263fa7d13f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.732 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.738 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd03305-6581-4256-a733-d76b6fe24355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.758 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.759 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.7686] device (tap7e573664-00): carrier: link connected
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.774 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[811cb3a0-1351-4e3d-81e2-4c88b7b7d0bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.794 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d60939-4002-46c5-abc9-9a6b682765bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e573664-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:03:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604837, 'reachable_time': 21318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234819, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.813 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c40a3cc9-b3d2-47c3-9dca-3f14c70c0877]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:312'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 604837, 'tstamp': 604837}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234820, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.841 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[37591897-a4e6-4ff7-a73e-2ffeb07fb481]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e573664-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:03:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604837, 'reachable_time': 21318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234821, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.878 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a10cc006-1828-4e52-95d7-7068ed03c36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.956 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[386d0d23-8575-4c56-8e46-9533bd2b8cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.958 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e573664-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.958 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.958 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e573664-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 kernel: tap7e573664-00: entered promiscuous mode
Nov 22 03:15:05 np0005531887 NetworkManager[55210]: <info>  [1763799305.9611] manager: (tap7e573664-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.965 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e573664-00, col_values=(('external_ids', {'iface-id': '32debc95-1c60-4d8b-9d74-79ae74c8f38f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:05Z|00477|binding|INFO|Releasing lport 32debc95-1c60-4d8b-9d74-79ae74c8f38f from this chassis (sb_readonly=0)
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.966 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 nova_compute[186849]: 2025-11-22 08:15:05.994 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.996 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.997 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2e53d01e-a08e-43cf-810c-7b3043221ed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.998 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-7e573664-04ba-4ce5-994a-9fb9483a2400
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:15:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 7e573664-04ba-4ce5-994a-9fb9483a2400
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:05.999 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'env', 'PROCESS_TAG=haproxy-7e573664-04ba-4ce5-994a-9fb9483a2400', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e573664-04ba-4ce5-994a-9fb9483a2400.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.158 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799306.1582682, 11b63192-42c9-4462-80c0-d66b0f6fcd47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.159 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] VM Started (Lifecycle Event)#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.215 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.228 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799306.1592104, 11b63192-42c9-4462-80c0-d66b0f6fcd47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.228 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.266 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.271 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.293 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:06 np0005531887 podman[234861]: 2025-11-22 08:15:06.438964653 +0000 UTC m=+0.036168754 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.646 186853 DEBUG nova.compute.manager [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:06 np0005531887 podman[234861]: 2025-11-22 08:15:06.710400932 +0000 UTC m=+0.307604953 container create a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.736 186853 DEBUG nova.compute.manager [req-d0b2ea1f-ac30-4fe9-b4f8-d1ac5fb1ecf8 req-a5f288db-19b0-487f-9af0-ca395ebc5bbc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.736 186853 DEBUG oslo_concurrency.lockutils [req-d0b2ea1f-ac30-4fe9-b4f8-d1ac5fb1ecf8 req-a5f288db-19b0-487f-9af0-ca395ebc5bbc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.737 186853 DEBUG oslo_concurrency.lockutils [req-d0b2ea1f-ac30-4fe9-b4f8-d1ac5fb1ecf8 req-a5f288db-19b0-487f-9af0-ca395ebc5bbc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.737 186853 DEBUG oslo_concurrency.lockutils [req-d0b2ea1f-ac30-4fe9-b4f8-d1ac5fb1ecf8 req-a5f288db-19b0-487f-9af0-ca395ebc5bbc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.737 186853 DEBUG nova.compute.manager [req-d0b2ea1f-ac30-4fe9-b4f8-d1ac5fb1ecf8 req-a5f288db-19b0-487f-9af0-ca395ebc5bbc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Processing event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.739 186853 INFO nova.compute.manager [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] instance snapshotting#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.746 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.747 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:06 np0005531887 systemd[1]: Started libpod-conmon-a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83.scope.
Nov 22 03:15:06 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:15:06 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dc039eb6a6e58fa720c6065638e598ce9270cfd7881cb655688e3e58fda1373/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:15:06 np0005531887 podman[234861]: 2025-11-22 08:15:06.848020998 +0000 UTC m=+0.445225119 container init a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:15:06 np0005531887 podman[234861]: 2025-11-22 08:15:06.854178349 +0000 UTC m=+0.451382410 container start a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:15:06 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [NOTICE]   (234881) : New worker (234883) forked
Nov 22 03:15:06 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [NOTICE]   (234881) : Loading success.
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.878 186853 DEBUG nova.compute.manager [req-b5436ae8-7ade-41f5-b56f-e78c8a30c351 req-42711822-83d1-4d4e-861a-62c96c5a2e85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.879 186853 DEBUG oslo_concurrency.lockutils [req-b5436ae8-7ade-41f5-b56f-e78c8a30c351 req-42711822-83d1-4d4e-861a-62c96c5a2e85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.879 186853 DEBUG oslo_concurrency.lockutils [req-b5436ae8-7ade-41f5-b56f-e78c8a30c351 req-42711822-83d1-4d4e-861a-62c96c5a2e85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.880 186853 DEBUG oslo_concurrency.lockutils [req-b5436ae8-7ade-41f5-b56f-e78c8a30c351 req-42711822-83d1-4d4e-861a-62c96c5a2e85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.880 186853 DEBUG nova.compute.manager [req-b5436ae8-7ade-41f5-b56f-e78c8a30c351 req-42711822-83d1-4d4e-861a-62c96c5a2e85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Processing event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.882 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.886 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799306.8858535, 11b63192-42c9-4462-80c0-d66b0f6fcd47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.886 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.890 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.895 186853 INFO nova.virt.libvirt.driver [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance spawned successfully.#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.895 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.919 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.928 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.941 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 unbound from our chassis#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.942 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.943 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.944 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.944 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.945 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.945 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.946 186853 DEBUG nova.virt.libvirt.driver [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.956 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[822e0969-5c3a-467f-8895-c27a457fed67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.957 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c0a2255-61 in ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.961 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c0a2255-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.961 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[af139fa0-c412-406e-8640-637434e8c881]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.962 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73c6c7c1-4da2-4abb-80f0-314326876309]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.975 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[8564ecab-26d2-46c9-bb6f-ab80ec7e5446]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:06 np0005531887 nova_compute[186849]: 2025-11-22 08:15:06.978 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:15:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:06.992 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5f01d3dc-3f25-43dc-85df-9da76629fe3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.018 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[18e0a310-4811-4e41-86ba-9250cbdc362b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.023 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c53a419c-751f-48dd-b4cd-7bf685d8c691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 NetworkManager[55210]: <info>  [1763799307.0254] manager: (tap6c0a2255-60): new Veth device (/org/freedesktop/NetworkManager/Devices/220)
Nov 22 03:15:07 np0005531887 systemd-udevd[234803]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.055 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.071 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8187e4-770b-4789-a8bb-98a2428087bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.074 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9c479ed4-375d-4c53-b418-86293340df74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.094 186853 INFO nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Took 17.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.095 186853 DEBUG nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:07 np0005531887 NetworkManager[55210]: <info>  [1763799307.0991] device (tap6c0a2255-60): carrier: link connected
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.105 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[0afc64fa-1fe8-450e-826c-5d0dd3bbd669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.123 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9595f7ee-0521-4a6b-9ebc-dd1456abab5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c0a2255-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:a2:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604970, 'reachable_time': 43447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234902, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.138 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac1a5c6-9310-4d3a-b242-c0122244d2a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:a2f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 604970, 'tstamp': 604970}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234903, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.157 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[11403c97-eacc-4090-b247-304982931e5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c0a2255-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:a2:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604970, 'reachable_time': 43447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234904, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.188 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b37cc855-7830-47a2-b4f7-abb8fd4677c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.212 186853 INFO nova.virt.libvirt.driver [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Beginning live snapshot process#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.220 186853 INFO nova.compute.manager [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Took 19.40 seconds to build instance.#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.222 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2de1ecfd-880e-46f5-af7d-527c93a739a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.224 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c0a2255-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.224 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.225 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c0a2255-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.227 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 NetworkManager[55210]: <info>  [1763799307.2288] manager: (tap6c0a2255-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Nov 22 03:15:07 np0005531887 kernel: tap6c0a2255-60: entered promiscuous mode
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.230 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.233 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c0a2255-60, col_values=(('external_ids', {'iface-id': '433cf940-3b59-425c-aeb8-689a57de46c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.235 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:07Z|00478|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.235 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.238 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.240 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dea2d92d-8b52-41c4-94f1-9ea61ef78762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.241 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6c0a2255-6426-43c4-abc3-5c1857ba0a79
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6c0a2255-6426-43c4-abc3-5c1857ba0a79
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:15:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:07.243 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'env', 'PROCESS_TAG=haproxy-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c0a2255-6426-43c4-abc3-5c1857ba0a79.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.248 186853 DEBUG oslo_concurrency.lockutils [None req-6ca5164d-ab4a-4adb-b7c3-b69d44435bde 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:07 np0005531887 virtqemud[186424]: invalid argument: disk vda does not have an active block job
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.463 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.533 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json -f qcow2" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.534 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.618 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b/disk --force-share --output=json -f qcow2" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.637 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.695 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.697 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73,backing_fmt=raw /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:07 np0005531887 podman[234939]: 2025-11-22 08:15:07.637216104 +0000 UTC m=+0.033503448 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.792 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:15:07 np0005531887 nova_compute[186849]: 2025-11-22 08:15:07.793 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d3b7a77-8b28-4774-9eeb-65b858c3820b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:08 np0005531887 podman[234939]: 2025-11-22 08:15:08.156478258 +0000 UTC m=+0.552765572 container create 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.264 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73,backing_fmt=raw /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929.delta 1073741824" returned: 0 in 0.567s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.265 186853 INFO nova.virt.libvirt.driver [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.327 186853 DEBUG nova.virt.libvirt.guest [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 0 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 03:15:08 np0005531887 systemd[1]: Started libpod-conmon-424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89.scope.
Nov 22 03:15:08 np0005531887 podman[234960]: 2025-11-22 08:15:08.368295566 +0000 UTC m=+0.177847880 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:15:08 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:15:08 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74a3b7d57cca01364d34b3bb455af6416105ee88b2d9d9e48addb9806739ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.606 186853 DEBUG nova.network.neutron [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updated VIF entry in instance network info cache for port 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.607 186853 DEBUG nova.network.neutron [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.631 186853 DEBUG oslo_concurrency.lockutils [req-4f301cf4-078d-4c3d-80e2-1a46f14d9422 req-4d4a6f0b-59ca-488a-8e86-ccb69d94fc21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:08 np0005531887 podman[234939]: 2025-11-22 08:15:08.697186113 +0000 UTC m=+1.093473457 container init 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:15:08 np0005531887 podman[234939]: 2025-11-22 08:15:08.705731294 +0000 UTC m=+1.102018608 container start 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:15:08 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [NOTICE]   (234996) : New worker (234998) forked
Nov 22 03:15:08 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [NOTICE]   (234996) : Loading success.
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.837 186853 DEBUG nova.virt.libvirt.guest [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 1048575 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.904 186853 DEBUG nova.compute.manager [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.905 186853 DEBUG oslo_concurrency.lockutils [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.906 186853 DEBUG oslo_concurrency.lockutils [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.906 186853 DEBUG oslo_concurrency.lockutils [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.907 186853 DEBUG nova.compute.manager [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:08 np0005531887 nova_compute[186849]: 2025-11-22 08:15:08.907 186853 WARNING nova.compute.manager [req-af653da8-c153-4f14-9038-cae2da8c3306 req-0173d466-0cfc-49ad-9d19-28b6e050bc25 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received unexpected event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.003 186853 DEBUG nova.compute.manager [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.004 186853 DEBUG oslo_concurrency.lockutils [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.005 186853 DEBUG oslo_concurrency.lockutils [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.005 186853 DEBUG oslo_concurrency.lockutils [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.005 186853 DEBUG nova.compute.manager [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.006 186853 WARNING nova.compute.manager [req-fc04ea72-f941-4d5d-8b92-b027eeb97989 req-85f9d3fd-61e2-405d-bf00-365c4f70d453 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received unexpected event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.340 186853 DEBUG nova.virt.libvirt.guest [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 1048576 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.342 186853 INFO nova.virt.libvirt.driver [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.478 186853 DEBUG nova.privsep.utils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 22 03:15:09 np0005531887 nova_compute[186849]: 2025-11-22 08:15:09.479 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929.delta /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:10 np0005531887 nova_compute[186849]: 2025-11-22 08:15:10.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:11 np0005531887 podman[235017]: 2025-11-22 08:15:11.008542303 +0000 UTC m=+0.233556424 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:15:11 np0005531887 podman[235018]: 2025-11-22 08:15:11.036942595 +0000 UTC m=+0.257979459 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:15:11 np0005531887 nova_compute[186849]: 2025-11-22 08:15:11.256 186853 DEBUG oslo_concurrency.processutils [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929.delta /var/lib/nova/instances/snapshots/tmptyjgmp9c/be01f554f88348d09a37dcd3098d3929" returned: 0 in 1.777s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:11 np0005531887 nova_compute[186849]: 2025-11-22 08:15:11.257 186853 INFO nova.virt.libvirt.driver [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Snapshot extracted, beginning image upload#033[00m
Nov 22 03:15:12 np0005531887 nova_compute[186849]: 2025-11-22 08:15:12.039 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:12 np0005531887 nova_compute[186849]: 2025-11-22 08:15:12.060 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:12 np0005531887 nova_compute[186849]: 2025-11-22 08:15:12.081 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:12 np0005531887 nova_compute[186849]: 2025-11-22 08:15:12.082 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:15:12 np0005531887 nova_compute[186849]: 2025-11-22 08:15:12.082 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:13 np0005531887 nova_compute[186849]: 2025-11-22 08:15:13.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:15:15 np0005531887 nova_compute[186849]: 2025-11-22 08:15:15.418 186853 INFO nova.virt.libvirt.driver [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Snapshot image upload complete#033[00m
Nov 22 03:15:15 np0005531887 nova_compute[186849]: 2025-11-22 08:15:15.419 186853 INFO nova.compute.manager [None req-4de49a16-e83c-4b65-98e8-e82d6d2844b9 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Took 8.67 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 22 03:15:15 np0005531887 nova_compute[186849]: 2025-11-22 08:15:15.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:16 np0005531887 podman[235064]: 2025-11-22 08:15:16.833303191 +0000 UTC m=+0.055061151 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.063 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.705 186853 DEBUG nova.compute.manager [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.705 186853 DEBUG nova.compute.manager [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing instance network info cache due to event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.706 186853 DEBUG oslo_concurrency.lockutils [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.706 186853 DEBUG oslo_concurrency.lockutils [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:17 np0005531887 nova_compute[186849]: 2025-11-22 08:15:17.706 186853 DEBUG nova.network.neutron [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing network info cache for port d9732384-f751-4c79-b2d8-54d8d0a67924 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.451 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.452 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.452 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.452 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.453 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.461 186853 INFO nova.compute.manager [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Terminating instance#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.469 186853 DEBUG nova.compute.manager [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:15:19 np0005531887 kernel: tap6fca0d10-ec (unregistering): left promiscuous mode
Nov 22 03:15:19 np0005531887 NetworkManager[55210]: <info>  [1763799319.4977] device (tap6fca0d10-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:19Z|00479|binding|INFO|Releasing lport 6fca0d10-ec3d-4b7b-844b-458a39db0a47 from this chassis (sb_readonly=0)
Nov 22 03:15:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:19Z|00480|binding|INFO|Setting lport 6fca0d10-ec3d-4b7b-844b-458a39db0a47 down in Southbound
Nov 22 03:15:19 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:19Z|00481|binding|INFO|Removing iface tap6fca0d10-ec ovn-installed in OVS
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.524 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:00:cb 10.100.0.5'], port_security=['fa:16:3e:90:00:cb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9d3b7a77-8b28-4774-9eeb-65b858c3820b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9016c6b616412fa2db0983e23a8150', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7420c781-e9c7-4653-97a5-92e76e44aa71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e7964e9-a04c-4b66-8053-f482dcbb2cee, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=6fca0d10-ec3d-4b7b-844b-458a39db0a47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.526 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 6fca0d10-ec3d-4b7b-844b-458a39db0a47 in datapath 5cbf5083-8d50-44bd-b6ba-93e507a8654e unbound from our chassis#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.527 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5cbf5083-8d50-44bd-b6ba-93e507a8654e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.528 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c8a9e0-9cca-43d9-8a28-fa38e977b427]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.529 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e namespace which is not needed anymore#033[00m
Nov 22 03:15:19 np0005531887 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Nov 22 03:15:19 np0005531887 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000008d.scope: Consumed 16.241s CPU time.
Nov 22 03:15:19 np0005531887 systemd-machined[153180]: Machine qemu-51-instance-0000008d terminated.
Nov 22 03:15:19 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [NOTICE]   (234455) : haproxy version is 2.8.14-c23fe91
Nov 22 03:15:19 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [NOTICE]   (234455) : path to executable is /usr/sbin/haproxy
Nov 22 03:15:19 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [WARNING]  (234455) : Exiting Master process...
Nov 22 03:15:19 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [ALERT]    (234455) : Current worker (234457) exited with code 143 (Terminated)
Nov 22 03:15:19 np0005531887 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[234451]: [WARNING]  (234455) : All workers exited. Exiting... (0)
Nov 22 03:15:19 np0005531887 systemd[1]: libpod-09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444.scope: Deactivated successfully.
Nov 22 03:15:19 np0005531887 podman[235118]: 2025-11-22 08:15:19.669811801 +0000 UTC m=+0.051054080 container died 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:15:19 np0005531887 NetworkManager[55210]: <info>  [1763799319.6885] manager: (tap6fca0d10-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.691 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.695 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444-userdata-shm.mount: Deactivated successfully.
Nov 22 03:15:19 np0005531887 systemd[1]: var-lib-containers-storage-overlay-4ce1f7cf373bd09991ed80acf4e7908bce79a67ef32b6719ad78666bf614964e-merged.mount: Deactivated successfully.
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.737 186853 INFO nova.virt.libvirt.driver [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Instance destroyed successfully.#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.739 186853 DEBUG nova.objects.instance [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'resources' on Instance uuid 9d3b7a77-8b28-4774-9eeb-65b858c3820b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:19 np0005531887 podman[235118]: 2025-11-22 08:15:19.752080652 +0000 UTC m=+0.133322931 container cleanup 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 03:15:19 np0005531887 systemd[1]: libpod-conmon-09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444.scope: Deactivated successfully.
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.761 186853 DEBUG nova.virt.libvirt.vif [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-368734434',display_name='tempest-TestSnapshotPattern-server-368734434',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-368734434',id=141,image_ref='2b9b9d31-f80f-437c-8142-755f74bb78ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:14:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-9jtkwk6t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='64848f5c-64c9-41ed-9c0d-c2ef3839d5de',image_min_disk='1',image_min_ram='0',image_owner_id='5c9016c6b616412fa2db0983e23a8150',image_owner_project_name='tempest-TestSnapshotPattern-1254822391',image_owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member',image_user_id='72df4512d7f245118018df81223ce5ff',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:15:15Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',uuid=9d3b7a77-8b28-4774-9eeb-65b858c3820b,vcpu_mod
el=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.762 186853 DEBUG nova.network.os_vif_util [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.762 186853 DEBUG nova.network.os_vif_util [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.763 186853 DEBUG os_vif [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.765 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.765 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6fca0d10-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.768 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.769 186853 INFO os_vif [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:00:cb,bridge_name='br-int',has_traffic_filtering=True,id=6fca0d10-ec3d-4b7b-844b-458a39db0a47,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fca0d10-ec')#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.770 186853 INFO nova.virt.libvirt.driver [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Deleting instance files /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b_del#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.771 186853 INFO nova.virt.libvirt.driver [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Deletion of /var/lib/nova/instances/9d3b7a77-8b28-4774-9eeb-65b858c3820b_del complete#033[00m
Nov 22 03:15:19 np0005531887 podman[235164]: 2025-11-22 08:15:19.828712863 +0000 UTC m=+0.055112141 container remove 09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.834 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[47e8ed63-f797-4df0-9c73-17373eb40f25]: (4, ('Sat Nov 22 08:15:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e (09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444)\n09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444\nSat Nov 22 08:15:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e (09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444)\n09b990fd2c0f52de4d7a8c32cc4b3937fbc49dddac25fc692fcc373f51595444\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.836 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[96ad0bec-e346-451c-bc17-8827fcdba010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.837 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cbf5083-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.839 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 kernel: tap5cbf5083-80: left promiscuous mode
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.851 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.854 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[82153fed-3b16-47c9-ae35-b6671ac27750]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.867 186853 DEBUG nova.compute.manager [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.867 186853 DEBUG nova.compute.manager [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing instance network info cache due to event network-changed-6fca0d10-ec3d-4b7b-844b-458a39db0a47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.867 186853 DEBUG oslo_concurrency.lockutils [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.867 186853 DEBUG oslo_concurrency.lockutils [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.868 186853 DEBUG nova.network.neutron [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Refreshing network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.872 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[af7c18d4-877e-4155-96f8-73cca48cdee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.874 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d47bd6-0ff8-4d69-a99b-31ad426800bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.892 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e9272ec1-8161-4891-a78e-5382103caf50]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600871, 'reachable_time': 26692, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235177, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.895 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:15:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:19.895 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[41c947e0-7595-4bf1-8edb-4a3b5c140ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:19 np0005531887 systemd[1]: run-netns-ovnmeta\x2d5cbf5083\x2d8d50\x2d44bd\x2db6ba\x2d93e507a8654e.mount: Deactivated successfully.
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.927 186853 INFO nova.compute.manager [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.928 186853 DEBUG oslo.service.loopingcall [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.928 186853 DEBUG nova.compute.manager [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:15:19 np0005531887 nova_compute[186849]: 2025-11-22 08:15:19.928 186853 DEBUG nova.network.neutron [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.011 186853 DEBUG nova.compute.manager [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-unplugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.012 186853 DEBUG oslo_concurrency.lockutils [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.012 186853 DEBUG oslo_concurrency.lockutils [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.012 186853 DEBUG oslo_concurrency.lockutils [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.012 186853 DEBUG nova.compute.manager [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] No waiting events found dispatching network-vif-unplugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.013 186853 DEBUG nova.compute.manager [req-b4f2cc44-197f-4891-9bd5-44f38d4e2567 req-c096150a-bd8c-490a-aafe-3513e45f52fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-unplugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.078 186853 DEBUG nova.network.neutron [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updated VIF entry in instance network info cache for port d9732384-f751-4c79-b2d8-54d8d0a67924. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.078 186853 DEBUG nova.network.neutron [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:20 np0005531887 nova_compute[186849]: 2025-11-22 08:15:20.122 186853 DEBUG oslo_concurrency.lockutils [req-8f97dda9-aab1-4900-af99-2526e9a7889c req-b1035a92-5483-4595-8f6e-1831c7c77c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:21 np0005531887 podman[235192]: 2025-11-22 08:15:21.850533109 +0000 UTC m=+0.066072261 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:15:21 np0005531887 nova_compute[186849]: 2025-11-22 08:15:21.937 186853 DEBUG nova.network.neutron [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:21 np0005531887 nova_compute[186849]: 2025-11-22 08:15:21.959 186853 INFO nova.compute.manager [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Took 2.03 seconds to deallocate network for instance.#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.003 186853 DEBUG nova.compute.manager [req-3069bc82-d925-40a9-b723-1eb68ba8f267 req-63412012-9854-4c0a-aabd-46e53a696c3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-deleted-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.065 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.125 186853 DEBUG nova.compute.manager [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.125 186853 DEBUG oslo_concurrency.lockutils [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.126 186853 DEBUG oslo_concurrency.lockutils [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.126 186853 DEBUG oslo_concurrency.lockutils [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.126 186853 DEBUG nova.compute.manager [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] No waiting events found dispatching network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.127 186853 WARNING nova.compute.manager [req-62b00c9c-432b-4442-8f07-5b86b24c0930 req-d0448557-dc79-48f0-b154-df628824d97e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Received unexpected event network-vif-plugged-6fca0d10-ec3d-4b7b-844b-458a39db0a47 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.213 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.214 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.320 186853 DEBUG nova.compute.provider_tree [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.337 186853 DEBUG nova.scheduler.client.report [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.361 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.400 186853 INFO nova.scheduler.client.report [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Deleted allocations for instance 9d3b7a77-8b28-4774-9eeb-65b858c3820b#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.478 186853 DEBUG oslo_concurrency.lockutils [None req-80b21385-dc4c-4ffc-aed6-ecc63cbd3d60 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "9d3b7a77-8b28-4774-9eeb-65b858c3820b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.579 186853 DEBUG nova.network.neutron [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updated VIF entry in instance network info cache for port 6fca0d10-ec3d-4b7b-844b-458a39db0a47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.580 186853 DEBUG nova.network.neutron [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Updating instance_info_cache with network_info: [{"id": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "address": "fa:16:3e:90:00:cb", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fca0d10-ec", "ovs_interfaceid": "6fca0d10-ec3d-4b7b-844b-458a39db0a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:22 np0005531887 nova_compute[186849]: 2025-11-22 08:15:22.607 186853 DEBUG oslo_concurrency.lockutils [req-c6d974c0-0475-44aa-b424-5bd5528052be req-9a997e91-da9a-4b84-a8b1-04594cba6576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9d3b7a77-8b28-4774-9eeb-65b858c3820b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:22Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:aa:4b 10.100.0.9
Nov 22 03:15:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:22Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:aa:4b 10.100.0.9
Nov 22 03:15:24 np0005531887 nova_compute[186849]: 2025-11-22 08:15:24.769 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:27 np0005531887 nova_compute[186849]: 2025-11-22 08:15:27.068 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:27 np0005531887 podman[235212]: 2025-11-22 08:15:27.841158069 +0000 UTC m=+0.056709771 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:15:29 np0005531887 nova_compute[186849]: 2025-11-22 08:15:29.772 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:32 np0005531887 nova_compute[186849]: 2025-11-22 08:15:32.070 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:32 np0005531887 podman[235232]: 2025-11-22 08:15:32.846558195 +0000 UTC m=+0.056834963 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:15:34 np0005531887 nova_compute[186849]: 2025-11-22 08:15:34.737 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799319.7304354, 9d3b7a77-8b28-4774-9eeb-65b858c3820b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:34 np0005531887 nova_compute[186849]: 2025-11-22 08:15:34.738 186853 INFO nova.compute.manager [-] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:15:34 np0005531887 nova_compute[186849]: 2025-11-22 08:15:34.763 186853 DEBUG nova.compute.manager [None req-423c6c1c-a55d-4724-9e98-e0e6a9a95e4f - - - - - -] [instance: 9d3b7a77-8b28-4774-9eeb-65b858c3820b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:34 np0005531887 nova_compute[186849]: 2025-11-22 08:15:34.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:35 np0005531887 nova_compute[186849]: 2025-11-22 08:15:35.907 186853 DEBUG nova.compute.manager [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:35 np0005531887 nova_compute[186849]: 2025-11-22 08:15:35.907 186853 DEBUG nova.compute.manager [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing instance network info cache due to event network-changed-d9732384-f751-4c79-b2d8-54d8d0a67924. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:15:35 np0005531887 nova_compute[186849]: 2025-11-22 08:15:35.908 186853 DEBUG oslo_concurrency.lockutils [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:35 np0005531887 nova_compute[186849]: 2025-11-22 08:15:35.908 186853 DEBUG oslo_concurrency.lockutils [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:35 np0005531887 nova_compute[186849]: 2025-11-22 08:15:35.908 186853 DEBUG nova.network.neutron [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Refreshing network info cache for port d9732384-f751-4c79-b2d8-54d8d0a67924 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.000 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.001 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.001 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.001 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.002 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.009 186853 INFO nova.compute.manager [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Terminating instance#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.017 186853 DEBUG nova.compute.manager [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:15:36 np0005531887 kernel: tapd9732384-f7 (unregistering): left promiscuous mode
Nov 22 03:15:36 np0005531887 NetworkManager[55210]: <info>  [1763799336.0413] device (tapd9732384-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00482|binding|INFO|Releasing lport d9732384-f751-4c79-b2d8-54d8d0a67924 from this chassis (sb_readonly=0)
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00483|binding|INFO|Setting lport d9732384-f751-4c79-b2d8-54d8d0a67924 down in Southbound
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00484|binding|INFO|Removing iface tapd9732384-f7 ovn-installed in OVS
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.053 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.064 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:aa:4b 10.100.0.9'], port_security=['fa:16:3e:ba:aa:4b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '11b63192-42c9-4462-80c0-d66b0f6fcd47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e573664-04ba-4ce5-994a-9fb9483a2400', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da796c20-96a3-420c-a9ae-3320426db7c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=d9732384-f751-4c79-b2d8-54d8d0a67924) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.066 104084 INFO neutron.agent.ovn.metadata.agent [-] Port d9732384-f751-4c79-b2d8-54d8d0a67924 in datapath 7e573664-04ba-4ce5-994a-9fb9483a2400 unbound from our chassis#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.065 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.067 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e573664-04ba-4ce5-994a-9fb9483a2400, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.068 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c54482-87bd-4ddd-8867-337b9e90cae9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.069 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 namespace which is not needed anymore#033[00m
Nov 22 03:15:36 np0005531887 kernel: tap77caeab6-ac (unregistering): left promiscuous mode
Nov 22 03:15:36 np0005531887 NetworkManager[55210]: <info>  [1763799336.0876] device (tap77caeab6-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.093 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.095 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00485|binding|INFO|Releasing lport 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 from this chassis (sb_readonly=0)
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00486|binding|INFO|Setting lport 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 down in Southbound
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00487|binding|INFO|Removing iface tap77caeab6-ac ovn-installed in OVS
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.096 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.105 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:9a:3c 2001:db8:0:1:f816:3eff:fe14:9a3c 2001:db8::f816:3eff:fe14:9a3c'], port_security=['fa:16:3e:14:9a:3c 2001:db8:0:1:f816:3eff:fe14:9a3c 2001:db8::f816:3eff:fe14:9a3c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe14:9a3c/64 2001:db8::f816:3eff:fe14:9a3c/64', 'neutron:device_id': '11b63192-42c9-4462-80c0-d66b0f6fcd47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.113 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Nov 22 03:15:36 np0005531887 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000008f.scope: Consumed 15.829s CPU time.
Nov 22 03:15:36 np0005531887 systemd-machined[153180]: Machine qemu-52-instance-0000008f terminated.
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [NOTICE]   (234881) : haproxy version is 2.8.14-c23fe91
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [NOTICE]   (234881) : path to executable is /usr/sbin/haproxy
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [WARNING]  (234881) : Exiting Master process...
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [ALERT]    (234881) : Current worker (234883) exited with code 143 (Terminated)
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[234877]: [WARNING]  (234881) : All workers exited. Exiting... (0)
Nov 22 03:15:36 np0005531887 systemd[1]: libpod-a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83.scope: Deactivated successfully.
Nov 22 03:15:36 np0005531887 NetworkManager[55210]: <info>  [1763799336.2404] manager: (tapd9732384-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Nov 22 03:15:36 np0005531887 podman[235285]: 2025-11-22 08:15:36.244688997 +0000 UTC m=+0.081385460 container died a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.245 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.251 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 NetworkManager[55210]: <info>  [1763799336.2540] manager: (tap77caeab6-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.293 186853 INFO nova.virt.libvirt.driver [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Instance destroyed successfully.#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.294 186853 DEBUG nova.objects.instance [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 11b63192-42c9-4462-80c0-d66b0f6fcd47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.307 186853 DEBUG nova.virt.libvirt.vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:15:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:15:07Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.307 186853 DEBUG nova.network.os_vif_util [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.308 186853 DEBUG nova.network.os_vif_util [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.308 186853 DEBUG os_vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.309 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.309 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9732384-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.311 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83-userdata-shm.mount: Deactivated successfully.
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.313 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay-3dc039eb6a6e58fa720c6065638e598ce9270cfd7881cb655688e3e58fda1373-merged.mount: Deactivated successfully.
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.316 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.320 186853 INFO os_vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:aa:4b,bridge_name='br-int',has_traffic_filtering=True,id=d9732384-f751-4c79-b2d8-54d8d0a67924,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd9732384-f7')#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.321 186853 DEBUG nova.virt.libvirt.vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:14:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-540966288',display_name='tempest-TestGettingAddress-server-540966288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-540966288',id=143,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:15:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-yuux33uq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:15:07Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=11b63192-42c9-4462-80c0-d66b0f6fcd47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.321 186853 DEBUG nova.network.os_vif_util [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.322 186853 DEBUG nova.network.os_vif_util [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.322 186853 DEBUG os_vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.324 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.324 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77caeab6-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.327 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.329 186853 INFO os_vif [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:9a:3c,bridge_name='br-int',has_traffic_filtering=True,id=77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77caeab6-ac')#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.330 186853 INFO nova.virt.libvirt.driver [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Deleting instance files /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47_del#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.330 186853 INFO nova.virt.libvirt.driver [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Deletion of /var/lib/nova/instances/11b63192-42c9-4462-80c0-d66b0f6fcd47_del complete#033[00m
Nov 22 03:15:36 np0005531887 podman[235285]: 2025-11-22 08:15:36.346683394 +0000 UTC m=+0.183379857 container cleanup a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:15:36 np0005531887 systemd[1]: libpod-conmon-a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83.scope: Deactivated successfully.
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.440 186853 INFO nova.compute.manager [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.441 186853 DEBUG oslo.service.loopingcall [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.442 186853 DEBUG nova.compute.manager [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.442 186853 DEBUG nova.network.neutron [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:15:36 np0005531887 podman[235347]: 2025-11-22 08:15:36.46688057 +0000 UTC m=+0.096948823 container remove a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.473 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb16cf2-f175-4b4e-8016-a4fbb07b4467]: (4, ('Sat Nov 22 08:15:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 (a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83)\na97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83\nSat Nov 22 08:15:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 (a97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83)\na97a28606d197da05da3aabdd453428e9790d76caedf8354ddc360222672ba83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.475 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c81d2077-12af-4c39-b46a-08948089126f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.476 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e573664-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.477 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 kernel: tap7e573664-00: left promiscuous mode
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.483 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0640eb8b-3d7d-4f34-8ec3-e910f698b71c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.507 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[62a19a4e-5dee-44ce-9765-e5ded2ab42ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.509 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aa901633-7967-48e1-8f83-1916375a817f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.525 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6806ffea-b65e-4c30-b654-c0979d17a299]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604829, 'reachable_time': 37370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235362, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.527 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.528 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5cccc7-718c-41b8-b7f2-3d729e656c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.528 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 unbound from our chassis#033[00m
Nov 22 03:15:36 np0005531887 systemd[1]: run-netns-ovnmeta\x2d7e573664\x2d04ba\x2d4ce5\x2d994a\x2d9fb9483a2400.mount: Deactivated successfully.
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.530 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.531 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6096fab3-2470-49b3-a9f9-47d7ba94b9f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.532 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 namespace which is not needed anymore#033[00m
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [NOTICE]   (234996) : haproxy version is 2.8.14-c23fe91
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [NOTICE]   (234996) : path to executable is /usr/sbin/haproxy
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [WARNING]  (234996) : Exiting Master process...
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [ALERT]    (234996) : Current worker (234998) exited with code 143 (Terminated)
Nov 22 03:15:36 np0005531887 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[234990]: [WARNING]  (234996) : All workers exited. Exiting... (0)
Nov 22 03:15:36 np0005531887 systemd[1]: libpod-424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89.scope: Deactivated successfully.
Nov 22 03:15:36 np0005531887 podman[235378]: 2025-11-22 08:15:36.679321183 +0000 UTC m=+0.058253479 container died 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:15:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89-userdata-shm.mount: Deactivated successfully.
Nov 22 03:15:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay-ad74a3b7d57cca01364d34b3bb455af6416105ee88b2d9d9e48addb9806739ee-merged.mount: Deactivated successfully.
Nov 22 03:15:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:15:36Z|00488|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:36 np0005531887 podman[235378]: 2025-11-22 08:15:36.753127265 +0000 UTC m=+0.132059551 container cleanup 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:15:36 np0005531887 systemd[1]: libpod-conmon-424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89.scope: Deactivated successfully.
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.888 186853 DEBUG nova.compute.manager [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-unplugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.889 186853 DEBUG oslo_concurrency.lockutils [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.889 186853 DEBUG oslo_concurrency.lockutils [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.889 186853 DEBUG oslo_concurrency.lockutils [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.890 186853 DEBUG nova.compute.manager [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-unplugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.890 186853 DEBUG nova.compute.manager [req-9633c380-0013-48cd-9173-7857e257bbb3 req-60842bcf-813a-4209-9f4c-5f6e6ae0a2d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-unplugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 03:15:36 np0005531887 podman[235405]: 2025-11-22 08:15:36.903446114 +0000 UTC m=+0.129750113 container remove 424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.909 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6f5d0d-8a24-4fe9-b752-91622000d730]: (4, ('Sat Nov 22 08:15:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 (424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89)\n424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89\nSat Nov 22 08:15:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 (424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89)\n424a58f5fad36fe36f58ef8ac275cf4cf6524c3cc31fe07db1ab1cb882b07a89\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.913 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4844574e-58fc-4428-bc69-e82feca1a78e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.914 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c0a2255-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.916 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:36 np0005531887 kernel: tap6c0a2255-60: left promiscuous mode
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.941 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.945 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[be1184de-572e-44d1-b659-cee4ec102890]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.959 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.959 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5f24f06a-f0fd-4463-ab3e-ae0e9324c2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.962 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ad631df5-77f9-4cad-a213-b353367210aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:36 np0005531887 nova_compute[186849]: 2025-11-22 08:15:36.969 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.978 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[80a12fdb-c806-4569-9ce7-ecef48085eda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 604962, 'reachable_time': 37736, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235420, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.981 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 03:15:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:36.981 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[dfea2419-3bf1-4b9b-8fcc-a4b1cf5eef64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.072 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:15:37 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6c0a2255\x2d6426\x2d43c4\x2dabc3\x2d5c1857ba0a79.mount: Deactivated successfully.
Nov 22 03:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:37.350 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:37.350 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:37.350 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.803 186853 DEBUG nova.network.neutron [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updated VIF entry in instance network info cache for port d9732384-f751-4c79-b2d8-54d8d0a67924. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.804 186853 DEBUG nova.network.neutron [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [{"id": "d9732384-f751-4c79-b2d8-54d8d0a67924", "address": "fa:16:3e:ba:aa:4b", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd9732384-f7", "ovs_interfaceid": "d9732384-f751-4c79-b2d8-54d8d0a67924", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "address": "fa:16:3e:14:9a:3c", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe14:9a3c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77caeab6-ac", "ovs_interfaceid": "77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.847 186853 DEBUG nova.network.neutron [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.865 186853 DEBUG oslo_concurrency.lockutils [req-6de6538f-07dd-4dd7-a46b-de2214a97b7f req-ef6b4bf0-4288-4dbb-b68f-20dcd05108f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-11b63192-42c9-4462-80c0-d66b0f6fcd47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:15:37 np0005531887 nova_compute[186849]: 2025-11-22 08:15:37.876 186853 INFO nova.compute.manager [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Took 1.43 seconds to deallocate network for instance.
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.007 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.008 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.049 186853 DEBUG nova.compute.manager [req-e5ac36d9-a802-48c1-923e-239c3a5d9778 req-ec1bb191-01bc-461e-ac9d-462a4448b576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-deleted-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.049 186853 DEBUG nova.compute.manager [req-e5ac36d9-a802-48c1-923e-239c3a5d9778 req-ec1bb191-01bc-461e-ac9d-462a4448b576 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-deleted-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.081 186853 DEBUG nova.compute.provider_tree [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.094 186853 DEBUG nova.scheduler.client.report [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.116 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.146 186853 INFO nova.scheduler.client.report [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 11b63192-42c9-4462-80c0-d66b0f6fcd47
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.187 186853 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-unplugged-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.187 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.188 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.188 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.188 186853 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-unplugged-d9732384-f751-4c79-b2d8-54d8d0a67924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.189 186853 WARNING nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received unexpected event network-vif-unplugged-d9732384-f751-4c79-b2d8-54d8d0a67924 for instance with vm_state deleted and task_state None.
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.189 186853 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.189 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.189 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.190 186853 DEBUG oslo_concurrency.lockutils [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.190 186853 DEBUG nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.190 186853 WARNING nova.compute.manager [req-0ee8e693-fb85-46cb-9179-47ade2a79cca req-143fc773-7e94-479b-a1e1-7e0127bbaa18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received unexpected event network-vif-plugged-d9732384-f751-4c79-b2d8-54d8d0a67924 for instance with vm_state deleted and task_state None.
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.204 186853 DEBUG oslo_concurrency.lockutils [None req-93230fa6-9435-4be0-99dd-83405d37e62b 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:15:38 np0005531887 podman[235421]: 2025-11-22 08:15:38.860424673 +0000 UTC m=+0.084544698 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.983 186853 DEBUG nova.compute.manager [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.983 186853 DEBUG oslo_concurrency.lockutils [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.983 186853 DEBUG oslo_concurrency.lockutils [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.984 186853 DEBUG oslo_concurrency.lockutils [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "11b63192-42c9-4462-80c0-d66b0f6fcd47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.984 186853 DEBUG nova.compute.manager [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] No waiting events found dispatching network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:15:38 np0005531887 nova_compute[186849]: 2025-11-22 08:15:38.984 186853 WARNING nova.compute.manager [req-ca027ece-02ef-4e69-9bbe-02d7d2f3c2a2 req-f49e68f8-d076-4dc5-b635-2d797534b698 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Received unexpected event network-vif-plugged-77caeab6-ace1-4bd9-aa13-72f6bdc1cdd9 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:15:41 np0005531887 nova_compute[186849]: 2025-11-22 08:15:41.327 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:41 np0005531887 podman[235445]: 2025-11-22 08:15:41.883050724 +0000 UTC m=+0.093137549 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:15:41 np0005531887 podman[235446]: 2025-11-22 08:15:41.904492953 +0000 UTC m=+0.110560939 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 03:15:42 np0005531887 nova_compute[186849]: 2025-11-22 08:15:42.075 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.211 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.212 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.307 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.477 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.478 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.487 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.487 186853 INFO nova.compute.claims [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.670 186853 DEBUG nova.compute.provider_tree [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.700 186853 DEBUG nova.scheduler.client.report [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.740 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.741 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.820 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.821 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.857 186853 INFO nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:15:43 np0005531887 nova_compute[186849]: 2025-11-22 08:15:43.889 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.038 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.039 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.039 186853 INFO nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Creating image(s)#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.040 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.040 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.041 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.053 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.083 186853 DEBUG nova.policy [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70cb231da30d4002a985cf18a579cd6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.112 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.113 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.114 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.130 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.207 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.208 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.442 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk 1073741824" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.444 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.445 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.536 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.537 186853 DEBUG nova.virt.disk.api [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Checking if we can resize image /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.537 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.603 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.604 186853 DEBUG nova.virt.disk.api [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Cannot resize image /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.605 186853 DEBUG nova.objects.instance [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'migration_context' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.623 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.624 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Ensure instance console log exists: /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.624 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.625 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:44 np0005531887 nova_compute[186849]: 2025-11-22 08:15:44.625 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:46 np0005531887 nova_compute[186849]: 2025-11-22 08:15:46.329 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:47 np0005531887 nova_compute[186849]: 2025-11-22 08:15:47.077 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:47 np0005531887 podman[235506]: 2025-11-22 08:15:47.847775546 +0000 UTC m=+0.060644099 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:15:50 np0005531887 nova_compute[186849]: 2025-11-22 08:15:50.887 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Successfully created port: dd3a3100-fbea-496f-91f5-a4d7c56ff913 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:15:51 np0005531887 nova_compute[186849]: 2025-11-22 08:15:51.292 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799336.291037, 11b63192-42c9-4462-80c0-d66b0f6fcd47 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:15:51 np0005531887 nova_compute[186849]: 2025-11-22 08:15:51.293 186853 INFO nova.compute.manager [-] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:15:51 np0005531887 nova_compute[186849]: 2025-11-22 08:15:51.309 186853 DEBUG nova.compute.manager [None req-b72ab7e5-8b66-471c-9d04-d59055e85504 - - - - - -] [instance: 11b63192-42c9-4462-80c0-d66b0f6fcd47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:15:51 np0005531887 nova_compute[186849]: 2025-11-22 08:15:51.331 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:51.807 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:15:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:15:51.808 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:15:51 np0005531887 nova_compute[186849]: 2025-11-22 08:15:51.808 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:52 np0005531887 nova_compute[186849]: 2025-11-22 08:15:52.079 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:52 np0005531887 podman[235531]: 2025-11-22 08:15:52.846938067 +0000 UTC m=+0.066806590 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:15:53 np0005531887 nova_compute[186849]: 2025-11-22 08:15:53.189 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Successfully updated port: dd3a3100-fbea-496f-91f5-a4d7c56ff913 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:15:53 np0005531887 nova_compute[186849]: 2025-11-22 08:15:53.415 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:53 np0005531887 nova_compute[186849]: 2025-11-22 08:15:53.415 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquired lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:53 np0005531887 nova_compute[186849]: 2025-11-22 08:15:53.415 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:15:53 np0005531887 nova_compute[186849]: 2025-11-22 08:15:53.706 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:15:54 np0005531887 nova_compute[186849]: 2025-11-22 08:15:54.372 186853 DEBUG nova.compute.manager [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-changed-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:15:54 np0005531887 nova_compute[186849]: 2025-11-22 08:15:54.373 186853 DEBUG nova.compute.manager [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Refreshing instance network info cache due to event network-changed-dd3a3100-fbea-496f-91f5-a4d7c56ff913. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:15:54 np0005531887 nova_compute[186849]: 2025-11-22 08:15:54.373 186853 DEBUG oslo_concurrency.lockutils [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:15:54 np0005531887 nova_compute[186849]: 2025-11-22 08:15:54.896 186853 DEBUG nova.network.neutron [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Updating instance_info_cache with network_info: [{"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.268 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Releasing lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.269 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance network_info: |[{"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.269 186853 DEBUG oslo_concurrency.lockutils [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.270 186853 DEBUG nova.network.neutron [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Refreshing network info cache for port dd3a3100-fbea-496f-91f5-a4d7c56ff913 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.274 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Start _get_guest_xml network_info=[{"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.279 186853 WARNING nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.291 186853 DEBUG nova.virt.libvirt.host [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.292 186853 DEBUG nova.virt.libvirt.host [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.299 186853 DEBUG nova.virt.libvirt.host [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.300 186853 DEBUG nova.virt.libvirt.host [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.301 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.301 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.302 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.302 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.302 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.302 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.302 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.303 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.303 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.303 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.303 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.304 186853 DEBUG nova.virt.hardware [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.307 186853 DEBUG nova.virt.libvirt.vif [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:15:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1639208197',display_name='tempest-ServersTestJSON-server-1639208197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1639208197',id=146,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-v83hmndt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:15:43Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.308 186853 DEBUG nova.network.os_vif_util [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.308 186853 DEBUG nova.network.os_vif_util [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.309 186853 DEBUG nova.objects.instance [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'pci_devices' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.335 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <uuid>c9d97da7-1af8-48d1-9faa-7a8ef1e0699e</uuid>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <name>instance-00000092</name>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:name>tempest-ServersTestJSON-server-1639208197</nova:name>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:15:55</nova:creationTime>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:user uuid="11d95211a44e4da9a04eb309ec3ab024">tempest-ServersTestJSON-1620770071-project-member</nova:user>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:project uuid="70cb231da30d4002a985cf18a579cd6a">tempest-ServersTestJSON-1620770071</nova:project>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        <nova:port uuid="dd3a3100-fbea-496f-91f5-a4d7c56ff913">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="serial">c9d97da7-1af8-48d1-9faa-7a8ef1e0699e</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="uuid">c9d97da7-1af8-48d1-9faa-7a8ef1e0699e</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.config"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:f3:25:30"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <target dev="tapdd3a3100-fb"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/console.log" append="off"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:15:55 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:15:55 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:15:55 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:15:55 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.336 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Preparing to wait for external event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.337 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.337 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.338 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.339 186853 DEBUG nova.virt.libvirt.vif [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:15:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1639208197',display_name='tempest-ServersTestJSON-server-1639208197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1639208197',id=146,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-v83hmndt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:15:43Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.339 186853 DEBUG nova.network.os_vif_util [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.340 186853 DEBUG nova.network.os_vif_util [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.341 186853 DEBUG os_vif [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.342 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.342 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.343 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.346 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.347 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd3a3100-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.347 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd3a3100-fb, col_values=(('external_ids', {'iface-id': 'dd3a3100-fbea-496f-91f5-a4d7c56ff913', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:25:30', 'vm-uuid': 'c9d97da7-1af8-48d1-9faa-7a8ef1e0699e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.349 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:55 np0005531887 NetworkManager[55210]: <info>  [1763799355.3508] manager: (tapdd3a3100-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.353 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.355 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:55 np0005531887 nova_compute[186849]: 2025-11-22 08:15:55.356 186853 INFO os_vif [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb')#033[00m
Nov 22 03:15:57 np0005531887 nova_compute[186849]: 2025-11-22 08:15:57.080 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:15:57 np0005531887 nova_compute[186849]: 2025-11-22 08:15:57.241 186853 DEBUG nova.network.neutron [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Updated VIF entry in instance network info cache for port dd3a3100-fbea-496f-91f5-a4d7c56ff913. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:15:57 np0005531887 nova_compute[186849]: 2025-11-22 08:15:57.242 186853 DEBUG nova.network.neutron [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Updating instance_info_cache with network_info: [{"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:15:57 np0005531887 nova_compute[186849]: 2025-11-22 08:15:57.262 186853 DEBUG oslo_concurrency.lockutils [req-34fda450-0a72-4c67-9ad5-a2603857fb2e req-ca1f1db5-5f6b-4220-a356-d5bfa14cdaec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:15:58 np0005531887 podman[235554]: 2025-11-22 08:15:58.878896128 +0000 UTC m=+0.090429203 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:15:59 np0005531887 nova_compute[186849]: 2025-11-22 08:15:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:00 np0005531887 nova_compute[186849]: 2025-11-22 08:16:00.352 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:00.810 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:02 np0005531887 nova_compute[186849]: 2025-11-22 08:16:02.082 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:03 np0005531887 podman[235575]: 2025-11-22 08:16:03.848789618 +0000 UTC m=+0.063623122 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:16:04 np0005531887 nova_compute[186849]: 2025-11-22 08:16:04.761 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:16:04 np0005531887 nova_compute[186849]: 2025-11-22 08:16:04.762 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:16:04 np0005531887 nova_compute[186849]: 2025-11-22 08:16:04.762 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] No VIF found with MAC fa:16:3e:f3:25:30, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:16:04 np0005531887 nova_compute[186849]: 2025-11-22 08:16:04.763 186853 INFO nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Using config drive#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.195 186853 INFO nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Creating config drive at /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.config#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.199 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpipl5yfcr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.331 186853 DEBUG oslo_concurrency.processutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpipl5yfcr" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 kernel: tapdd3a3100-fb: entered promiscuous mode
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.4053] manager: (tapdd3a3100-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Nov 22 03:16:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:05Z|00489|binding|INFO|Claiming lport dd3a3100-fbea-496f-91f5-a4d7c56ff913 for this chassis.
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:05Z|00490|binding|INFO|dd3a3100-fbea-496f-91f5-a4d7c56ff913: Claiming fa:16:3e:f3:25:30 10.100.0.5
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 systemd-udevd[235616]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:16:05 np0005531887 systemd-machined[153180]: New machine qemu-53-instance-00000092.
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.4512] device (tapdd3a3100-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.4523] device (tapdd3a3100-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.463 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:05Z|00491|binding|INFO|Setting lport dd3a3100-fbea-496f-91f5-a4d7c56ff913 ovn-installed in OVS
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.470 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 systemd[1]: Started Virtual Machine qemu-53-instance-00000092.
Nov 22 03:16:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:05Z|00492|binding|INFO|Setting lport dd3a3100-fbea-496f-91f5-a4d7c56ff913 up in Southbound
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.507 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:25:30 10.100.0.5'], port_security=['fa:16:3e:f3:25:30 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c9d97da7-1af8-48d1-9faa-7a8ef1e0699e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=dd3a3100-fbea-496f-91f5-a4d7c56ff913) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.508 104084 INFO neutron.agent.ovn.metadata.agent [-] Port dd3a3100-fbea-496f-91f5-a4d7c56ff913 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a bound to our chassis#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.509 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66c945b4-7237-4e85-b411-0c51b31ea31a#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.522 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b7c55f-d907-40ae-919c-d4a136776937]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.523 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66c945b4-71 in ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.525 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66c945b4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.525 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[81971bc7-7585-45fe-bf61-962ff2a171b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.526 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6016aa9d-434e-49e1-97e5-a5fbe9e0a701]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.538 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5bfa84-a180-44f0-9883-c704dbdf2b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.566 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[94d972ef-48a9-44e5-87e9-49636e293ec7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.595 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9ef80a-96b3-4058-8993-7f787239e604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 systemd-udevd[235619]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.601 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d7952773-b7d1-49a6-9e58-a3681a492899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.6044] manager: (tap66c945b4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.635 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9d520f-459e-494d-8951-d23c6635df9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.639 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[a9de5c08-da8e-44c4-a091-0b2ce597703c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.6609] device (tap66c945b4-70): carrier: link connected
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.667 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4e341a-7198-4c10-9e3c-c04c9370a173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.684 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[01a29819-63ae-4e93-82df-90d595f85fca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610826, 'reachable_time': 40815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235650, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.701 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[83e6e560-ccbd-4754-af0c-77a26bbb193d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:5a27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 610826, 'tstamp': 610826}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235651, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.719 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a8169e3b-107f-4809-b331-e3fba8b7e100]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66c945b4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:5a:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610826, 'reachable_time': 40815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235652, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.751 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcd31db-2c59-4c10-9adc-7cc23df89da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.787 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.820 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f175a428-d38f-44fa-b157-5510ecab5627]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.821 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.821 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.822 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66c945b4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.823 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 kernel: tap66c945b4-70: entered promiscuous mode
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.825 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 NetworkManager[55210]: <info>  [1763799365.8255] manager: (tap66c945b4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.828 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66c945b4-70, col_values=(('external_ids', {'iface-id': 'd6ef1392-aa2a-4e3e-91ba-ec0ce61e416a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.829 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:05Z|00493|binding|INFO|Releasing lport d6ef1392-aa2a-4e3e-91ba-ec0ce61e416a from this chassis (sb_readonly=0)
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.830 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.831 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.831 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[06825a75-e0a0-4999-8168-d9de9f5916d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.832 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/66c945b4-7237-4e85-b411-0c51b31ea31a.pid.haproxy
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 66c945b4-7237-4e85-b411-0c51b31ea31a
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:16:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:05.832 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'env', 'PROCESS_TAG=haproxy-66c945b4-7237-4e85-b411-0c51b31ea31a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66c945b4-7237-4e85-b411-0c51b31ea31a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.842 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.859 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.921 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.922 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:16:05 np0005531887 nova_compute[186849]: 2025-11-22 08:16:05.975 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.138 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.139 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5702MB free_disk=73.27312088012695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.140 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.140 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:06 np0005531887 podman[235690]: 2025-11-22 08:16:06.195819889 +0000 UTC m=+0.021873121 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.304 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.329 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.354 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.355 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.380 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.413 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.456 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.468 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.486 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799366.4861426, c9d97da7-1af8-48d1-9faa-7a8ef1e0699e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.487 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] VM Started (Lifecycle Event)#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.500 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.502 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799366.4893444, c9d97da7-1af8-48d1-9faa-7a8ef1e0699e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.503 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.520 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.522 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.538 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.623 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:16:06 np0005531887 nova_compute[186849]: 2025-11-22 08:16:06.624 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:06 np0005531887 podman[235690]: 2025-11-22 08:16:06.809084183 +0000 UTC m=+0.635137405 container create 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.045 186853 DEBUG nova.compute.manager [req-8bbf7aca-0676-40b3-8def-265c79e1fcc8 req-86aca896-379f-4aa1-9fc7-e94ec24bc6e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.046 186853 DEBUG oslo_concurrency.lockutils [req-8bbf7aca-0676-40b3-8def-265c79e1fcc8 req-86aca896-379f-4aa1-9fc7-e94ec24bc6e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.046 186853 DEBUG oslo_concurrency.lockutils [req-8bbf7aca-0676-40b3-8def-265c79e1fcc8 req-86aca896-379f-4aa1-9fc7-e94ec24bc6e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.046 186853 DEBUG oslo_concurrency.lockutils [req-8bbf7aca-0676-40b3-8def-265c79e1fcc8 req-86aca896-379f-4aa1-9fc7-e94ec24bc6e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.046 186853 DEBUG nova.compute.manager [req-8bbf7aca-0676-40b3-8def-265c79e1fcc8 req-86aca896-379f-4aa1-9fc7-e94ec24bc6e7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Processing event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.047 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.050 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799367.0506268, c9d97da7-1af8-48d1-9faa-7a8ef1e0699e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.051 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.053 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.057 186853 INFO nova.virt.libvirt.driver [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance spawned successfully.#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.057 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.072 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.078 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.082 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.083 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.083 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.083 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.084 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.084 186853 DEBUG nova.virt.libvirt.driver [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.087 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.105 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:16:07 np0005531887 systemd[1]: Started libpod-conmon-1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964.scope.
Nov 22 03:16:07 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:16:07 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b43f42cdadff5df6186811a8bfa53d7c8e09922affda78fb631cb0523d9ffd5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:16:07 np0005531887 podman[235690]: 2025-11-22 08:16:07.288235138 +0000 UTC m=+1.114288390 container init 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 03:16:07 np0005531887 podman[235690]: 2025-11-22 08:16:07.299900925 +0000 UTC m=+1.125954157 container start 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 03:16:07 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [NOTICE]   (235716) : New worker (235718) forked
Nov 22 03:16:07 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [NOTICE]   (235716) : Loading success.
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.592 186853 INFO nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Took 23.55 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.593 186853 DEBUG nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.624 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.625 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.625 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.783 186853 INFO nova.compute.manager [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Took 24.37 seconds to build instance.#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.799 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:16:07 np0005531887 nova_compute[186849]: 2025-11-22 08:16:07.834 186853 DEBUG oslo_concurrency.lockutils [None req-68a48d9e-9fe3-45cd-9f11-ffadc4eab95a 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:08 np0005531887 nova_compute[186849]: 2025-11-22 08:16:08.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.184 186853 DEBUG nova.compute.manager [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.184 186853 DEBUG oslo_concurrency.lockutils [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.184 186853 DEBUG oslo_concurrency.lockutils [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.184 186853 DEBUG oslo_concurrency.lockutils [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.185 186853 DEBUG nova.compute.manager [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] No waiting events found dispatching network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:16:09 np0005531887 nova_compute[186849]: 2025-11-22 08:16:09.185 186853 WARNING nova.compute.manager [req-4c8d1881-b6fe-4958-851a-54f4d4b13ba4 req-bab9859d-7ae9-4275-8323-8cef5510ce9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received unexpected event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:16:09 np0005531887 podman[235729]: 2025-11-22 08:16:09.529840978 +0000 UTC m=+0.067351083 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:16:10 np0005531887 nova_compute[186849]: 2025-11-22 08:16:10.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:12 np0005531887 nova_compute[186849]: 2025-11-22 08:16:12.089 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:12 np0005531887 podman[235751]: 2025-11-22 08:16:12.839009534 +0000 UTC m=+0.062292839 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 03:16:12 np0005531887 podman[235752]: 2025-11-22 08:16:12.865208899 +0000 UTC m=+0.082104716 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.195 186853 DEBUG oslo_concurrency.lockutils [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.195 186853 DEBUG oslo_concurrency.lockutils [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.196 186853 DEBUG nova.compute.manager [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.200 186853 DEBUG nova.compute.manager [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.201 186853 DEBUG nova.objects.instance [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'flavor' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.244 186853 DEBUG nova.objects.instance [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'info_cache' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.271 186853 DEBUG nova.virt.libvirt.driver [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 22 03:16:13 np0005531887 nova_compute[186849]: 2025-11-22 08:16:13.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:15 np0005531887 nova_compute[186849]: 2025-11-22 08:16:15.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:15 np0005531887 nova_compute[186849]: 2025-11-22 08:16:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:17 np0005531887 nova_compute[186849]: 2025-11-22 08:16:17.090 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:18 np0005531887 podman[235800]: 2025-11-22 08:16:18.831043428 +0000 UTC m=+0.052192899 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:16:20 np0005531887 nova_compute[186849]: 2025-11-22 08:16:20.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:22 np0005531887 nova_compute[186849]: 2025-11-22 08:16:22.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:23 np0005531887 nova_compute[186849]: 2025-11-22 08:16:23.318 186853 DEBUG nova.virt.libvirt.driver [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:16:23 np0005531887 nova_compute[186849]: 2025-11-22 08:16:23.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:16:23 np0005531887 podman[235837]: 2025-11-22 08:16:23.864043766 +0000 UTC m=+0.080220870 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:16:25 np0005531887 nova_compute[186849]: 2025-11-22 08:16:25.368 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:27 np0005531887 nova_compute[186849]: 2025-11-22 08:16:27.094 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:27Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:25:30 10.100.0.5
Nov 22 03:16:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:27Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:25:30 10.100.0.5
Nov 22 03:16:29 np0005531887 podman[235858]: 2025-11-22 08:16:29.85158891 +0000 UTC m=+0.070875960 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:16:30 np0005531887 nova_compute[186849]: 2025-11-22 08:16:30.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:32 np0005531887 nova_compute[186849]: 2025-11-22 08:16:32.095 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:34 np0005531887 nova_compute[186849]: 2025-11-22 08:16:34.362 186853 DEBUG nova.virt.libvirt.driver [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 22 03:16:34 np0005531887 podman[235876]: 2025-11-22 08:16:34.859516758 +0000 UTC m=+0.079204816 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:16:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:34.924 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:16:34 np0005531887 nova_compute[186849]: 2025-11-22 08:16:34.925 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:34.926 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:16:35 np0005531887 nova_compute[186849]: 2025-11-22 08:16:35.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:36 np0005531887 kernel: tapdd3a3100-fb (unregistering): left promiscuous mode
Nov 22 03:16:36 np0005531887 NetworkManager[55210]: <info>  [1763799396.5432] device (tapdd3a3100-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:16:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:36Z|00494|binding|INFO|Releasing lport dd3a3100-fbea-496f-91f5-a4d7c56ff913 from this chassis (sb_readonly=0)
Nov 22 03:16:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:36Z|00495|binding|INFO|Setting lport dd3a3100-fbea-496f-91f5-a4d7c56ff913 down in Southbound
Nov 22 03:16:36 np0005531887 ovn_controller[95130]: 2025-11-22T08:16:36Z|00496|binding|INFO|Removing iface tapdd3a3100-fb ovn-installed in OVS
Nov 22 03:16:36 np0005531887 nova_compute[186849]: 2025-11-22 08:16:36.555 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:36.563 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:25:30 10.100.0.5'], port_security=['fa:16:3e:f3:25:30 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c9d97da7-1af8-48d1-9faa-7a8ef1e0699e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c945b4-7237-4e85-b411-0c51b31ea31a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70cb231da30d4002a985cf18a579cd6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cdac32cd-3018-48f9-b8b4-269b2f46b94b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63d9e41-5235-4b2c-88f9-85531fc2355b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=dd3a3100-fbea-496f-91f5-a4d7c56ff913) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:16:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:36.565 104084 INFO neutron.agent.ovn.metadata.agent [-] Port dd3a3100-fbea-496f-91f5-a4d7c56ff913 in datapath 66c945b4-7237-4e85-b411-0c51b31ea31a unbound from our chassis#033[00m
Nov 22 03:16:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:36.566 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66c945b4-7237-4e85-b411-0c51b31ea31a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:16:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:36.567 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf25104-8b32-4aae-ae93-1195944bf553]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:36.568 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a namespace which is not needed anymore#033[00m
Nov 22 03:16:36 np0005531887 nova_compute[186849]: 2025-11-22 08:16:36.571 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:36 np0005531887 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000092.scope: Deactivated successfully.
Nov 22 03:16:36 np0005531887 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000092.scope: Consumed 17.183s CPU time.
Nov 22 03:16:36 np0005531887 systemd-machined[153180]: Machine qemu-53-instance-00000092 terminated.
Nov 22 03:16:36 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [NOTICE]   (235716) : haproxy version is 2.8.14-c23fe91
Nov 22 03:16:36 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [NOTICE]   (235716) : path to executable is /usr/sbin/haproxy
Nov 22 03:16:36 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [WARNING]  (235716) : Exiting Master process...
Nov 22 03:16:36 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [ALERT]    (235716) : Current worker (235718) exited with code 143 (Terminated)
Nov 22 03:16:36 np0005531887 neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a[235710]: [WARNING]  (235716) : All workers exited. Exiting... (0)
Nov 22 03:16:36 np0005531887 systemd[1]: libpod-1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964.scope: Deactivated successfully.
Nov 22 03:16:36 np0005531887 podman[235924]: 2025-11-22 08:16:36.7166585 +0000 UTC m=+0.058342121 container died 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.816 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c9d97da7-1af8-48d1-9faa-7a8ef1e0699e', 'name': 'tempest-ServersTestJSON-server-1639208197', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000092', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '70cb231da30d4002a985cf18a579cd6a', 'user_id': '11d95211a44e4da9a04eb309ec3ab024', 'hostId': 'a02eadc065b2d5c1508b298f93e83b94d34d58955e3359dab2f4e055', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.817 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.817 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.818 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>]
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.818 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.820 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.822 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964-userdata-shm.mount: Deactivated successfully.
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.823 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.824 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:16:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay-b43f42cdadff5df6186811a8bfa53d7c8e09922affda78fb631cb0523d9ffd5f-merged.mount: Deactivated successfully.
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.825 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.826 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.827 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.828 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.830 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.831 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.833 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.833 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.834 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.835 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.836 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.836 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.836 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>]
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.837 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.837 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.838 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.838 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.839 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.839 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>]
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.841 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.841 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.842 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.843 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.844 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.845 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.846 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.846 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.846 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-1639208197>]
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:16:36.847 12 DEBUG ceilometer.compute.pollsters [-] Instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000092, id=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 03:16:36 np0005531887 podman[235924]: 2025-11-22 08:16:36.94559393 +0000 UTC m=+0.287277551 container cleanup 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:16:36 np0005531887 systemd[1]: libpod-conmon-1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964.scope: Deactivated successfully.
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.097 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.212 186853 DEBUG nova.compute.manager [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-vif-unplugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.213 186853 DEBUG oslo_concurrency.lockutils [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.214 186853 DEBUG oslo_concurrency.lockutils [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.214 186853 DEBUG oslo_concurrency.lockutils [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.215 186853 DEBUG nova.compute.manager [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] No waiting events found dispatching network-vif-unplugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.215 186853 WARNING nova.compute.manager [req-21780f84-91b6-4d4e-a481-32174fa88403 req-4f0603ca-9bca-4f42-b976-2c0c3a1e7bcd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received unexpected event network-vif-unplugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 for instance with vm_state active and task_state powering-off.#033[00m
Nov 22 03:16:37 np0005531887 podman[235966]: 2025-11-22 08:16:37.217743646 +0000 UTC m=+0.248924875 container remove 1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.223 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[36123843-bbe9-4427-9881-7d37482c5b0b]: (4, ('Sat Nov 22 08:16:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964)\n1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964\nSat Nov 22 08:16:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a (1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964)\n1635fa3d9cdb33ca4edf9db68dfe1b6eb4431b65f8869cea3c6d0ca9afded964\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.225 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[99af1e20-b1f6-4906-935b-2376b885d56b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.226 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66c945b4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.229 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:37 np0005531887 kernel: tap66c945b4-70: left promiscuous mode
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.251 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.255 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[24891b2f-8e4d-4493-88cb-e1169823cb43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.271 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4a614eb1-21c2-4cf3-acba-7291014615bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.273 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[23f46f1b-8335-4d41-a55b-d183a9af65be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.295 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6a725ae4-aec8-4ee4-bc9b-b85de8ccce59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 610819, 'reachable_time': 43450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235985, 'error': None, 'target': 'ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 systemd[1]: run-netns-ovnmeta\x2d66c945b4\x2d7237\x2d4e85\x2db411\x2d0c51b31ea31a.mount: Deactivated successfully.
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.299 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66c945b4-7237-4e85-b411-0c51b31ea31a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.299 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[dd67672b-5e09-4176-95ff-ae1bf416b19e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.350 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.351 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:37.351 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.375 186853 INFO nova.virt.libvirt.driver [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance shutdown successfully after 24 seconds.#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.382 186853 INFO nova.virt.libvirt.driver [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance destroyed successfully.#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.382 186853 DEBUG nova.objects.instance [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'numa_topology' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.395 186853 DEBUG nova.compute.manager [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:37 np0005531887 nova_compute[186849]: 2025-11-22 08:16:37.470 186853 DEBUG oslo_concurrency.lockutils [None req-3684d350-190a-46ef-a55a-0ef29c27905b 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.662 186853 DEBUG nova.compute.manager [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.662 186853 DEBUG oslo_concurrency.lockutils [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.663 186853 DEBUG oslo_concurrency.lockutils [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.663 186853 DEBUG oslo_concurrency.lockutils [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.664 186853 DEBUG nova.compute.manager [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] No waiting events found dispatching network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:16:39 np0005531887 nova_compute[186849]: 2025-11-22 08:16:39.664 186853 WARNING nova.compute.manager [req-7f564e59-e0a7-4539-9cc4-aec873c60e80 req-6a0f8dc1-e60b-435c-8e3c-e50ce71ac775 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received unexpected event network-vif-plugged-dd3a3100-fbea-496f-91f5-a4d7c56ff913 for instance with vm_state stopped and task_state None.#033[00m
Nov 22 03:16:39 np0005531887 podman[235986]: 2025-11-22 08:16:39.839326073 +0000 UTC m=+0.059845798 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 22 03:16:39 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:16:39.930 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:40 np0005531887 nova_compute[186849]: 2025-11-22 08:16:40.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.866 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.867 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.867 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.867 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.867 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.875 186853 INFO nova.compute.manager [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Terminating instance#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.880 186853 DEBUG nova.compute.manager [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.886 186853 INFO nova.virt.libvirt.driver [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Instance destroyed successfully.#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.886 186853 DEBUG nova.objects.instance [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lazy-loading 'resources' on Instance uuid c9d97da7-1af8-48d1-9faa-7a8ef1e0699e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.906 186853 DEBUG nova.virt.libvirt.vif [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:15:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1639208197',display_name='tempest-Íñstáñcé-1385477844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1639208197',id=146,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:16:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='70cb231da30d4002a985cf18a579cd6a',ramdisk_id='',reservation_id='r-v83hmndt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-ServersTestJSON-1620770071',owner_user_name='tempest-ServersTestJSON-1620770071-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:16:40Z,user_data=None,user_id='11d95211a44e4da9a04eb309ec3ab024',uuid=c9d97da7-1af8-48d1-9faa-7a8ef1e0699e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.907 186853 DEBUG nova.network.os_vif_util [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converting VIF {"id": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "address": "fa:16:3e:f3:25:30", "network": {"id": "66c945b4-7237-4e85-b411-0c51b31ea31a", "bridge": "br-int", "label": "tempest-ServersTestJSON-624222518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70cb231da30d4002a985cf18a579cd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd3a3100-fb", "ovs_interfaceid": "dd3a3100-fbea-496f-91f5-a4d7c56ff913", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.908 186853 DEBUG nova.network.os_vif_util [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.909 186853 DEBUG os_vif [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.912 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.913 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd3a3100-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.914 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.918 186853 INFO os_vif [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:25:30,bridge_name='br-int',has_traffic_filtering=True,id=dd3a3100-fbea-496f-91f5-a4d7c56ff913,network=Network(66c945b4-7237-4e85-b411-0c51b31ea31a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd3a3100-fb')#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.918 186853 INFO nova.virt.libvirt.driver [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Deleting instance files /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e_del#033[00m
Nov 22 03:16:41 np0005531887 nova_compute[186849]: 2025-11-22 08:16:41.919 186853 INFO nova.virt.libvirt.driver [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Deletion of /var/lib/nova/instances/c9d97da7-1af8-48d1-9faa-7a8ef1e0699e_del complete#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.038 186853 INFO nova.compute.manager [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Took 0.16 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.038 186853 DEBUG oslo.service.loopingcall [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.039 186853 DEBUG nova.compute.manager [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.039 186853 DEBUG nova.network.neutron [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.100 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.862 186853 DEBUG nova.network.neutron [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.901 186853 INFO nova.compute.manager [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Took 0.86 seconds to deallocate network for instance.#033[00m
Nov 22 03:16:42 np0005531887 nova_compute[186849]: 2025-11-22 08:16:42.982 186853 DEBUG nova.compute.manager [req-4412695f-b50f-4673-b3f6-cd43ba928642 req-768e9052-0b5e-4270-b01a-76d772c067b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Received event network-vif-deleted-dd3a3100-fbea-496f-91f5-a4d7c56ff913 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.022 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.022 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.113 186853 DEBUG nova.compute.provider_tree [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.127 186853 DEBUG nova.scheduler.client.report [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.153 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.193 186853 INFO nova.scheduler.client.report [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Deleted allocations for instance c9d97da7-1af8-48d1-9faa-7a8ef1e0699e#033[00m
Nov 22 03:16:43 np0005531887 nova_compute[186849]: 2025-11-22 08:16:43.343 186853 DEBUG oslo_concurrency.lockutils [None req-ee3d4779-859f-47ac-8526-2e6a466937af 11d95211a44e4da9a04eb309ec3ab024 70cb231da30d4002a985cf18a579cd6a - - default default] Lock "c9d97da7-1af8-48d1-9faa-7a8ef1e0699e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:16:43 np0005531887 podman[236008]: 2025-11-22 08:16:43.849475708 +0000 UTC m=+0.071497405 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 03:16:43 np0005531887 podman[236009]: 2025-11-22 08:16:43.883557979 +0000 UTC m=+0.101090385 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 03:16:46 np0005531887 nova_compute[186849]: 2025-11-22 08:16:46.917 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:47 np0005531887 nova_compute[186849]: 2025-11-22 08:16:47.101 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:49 np0005531887 podman[236053]: 2025-11-22 08:16:49.864253205 +0000 UTC m=+0.080169610 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:16:51 np0005531887 nova_compute[186849]: 2025-11-22 08:16:51.816 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799396.814937, c9d97da7-1af8-48d1-9faa-7a8ef1e0699e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:16:51 np0005531887 nova_compute[186849]: 2025-11-22 08:16:51.817 186853 INFO nova.compute.manager [-] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:16:51 np0005531887 nova_compute[186849]: 2025-11-22 08:16:51.846 186853 DEBUG nova.compute.manager [None req-8225f305-5c51-4a5a-a0a5-24ba33a46711 - - - - - -] [instance: c9d97da7-1af8-48d1-9faa-7a8ef1e0699e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:16:51 np0005531887 nova_compute[186849]: 2025-11-22 08:16:51.920 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:52 np0005531887 nova_compute[186849]: 2025-11-22 08:16:52.104 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:54 np0005531887 podman[236079]: 2025-11-22 08:16:54.849482791 +0000 UTC m=+0.069303920 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:16:56 np0005531887 nova_compute[186849]: 2025-11-22 08:16:56.237 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:56 np0005531887 nova_compute[186849]: 2025-11-22 08:16:56.923 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:57 np0005531887 nova_compute[186849]: 2025-11-22 08:16:57.106 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:16:59 np0005531887 nova_compute[186849]: 2025-11-22 08:16:59.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:00 np0005531887 podman[236100]: 2025-11-22 08:17:00.849509572 +0000 UTC m=+0.068964562 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:17:01 np0005531887 nova_compute[186849]: 2025-11-22 08:17:01.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:02 np0005531887 nova_compute[186849]: 2025-11-22 08:17:02.108 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.807 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:17:05 np0005531887 podman[236120]: 2025-11-22 08:17:05.840244831 +0000 UTC m=+0.055876633 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.984 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.985 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5739MB free_disk=73.27426147460938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.985 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:05 np0005531887 nova_compute[186849]: 2025-11-22 08:17:05.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.049 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.050 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.073 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.100 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.406 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.407 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:06 np0005531887 nova_compute[186849]: 2025-11-22 08:17:06.929 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.110 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.407 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.408 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.408 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:17:07 np0005531887 nova_compute[186849]: 2025-11-22 08:17:07.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:17:08 np0005531887 nova_compute[186849]: 2025-11-22 08:17:08.776 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:09 np0005531887 nova_compute[186849]: 2025-11-22 08:17:09.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:10 np0005531887 podman[236145]: 2025-11-22 08:17:10.855298138 +0000 UTC m=+0.070128705 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal)
Nov 22 03:17:11 np0005531887 nova_compute[186849]: 2025-11-22 08:17:11.933 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:12 np0005531887 nova_compute[186849]: 2025-11-22 08:17:12.113 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:14 np0005531887 nova_compute[186849]: 2025-11-22 08:17:14.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:14 np0005531887 podman[236166]: 2025-11-22 08:17:14.860955594 +0000 UTC m=+0.076231027 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:17:14 np0005531887 podman[236167]: 2025-11-22 08:17:14.877215386 +0000 UTC m=+0.096662742 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:17:16 np0005531887 nova_compute[186849]: 2025-11-22 08:17:16.936 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:17 np0005531887 nova_compute[186849]: 2025-11-22 08:17:17.115 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:17 np0005531887 nova_compute[186849]: 2025-11-22 08:17:17.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:20 np0005531887 podman[236211]: 2025-11-22 08:17:20.834120645 +0000 UTC m=+0.052715585 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:17:21 np0005531887 nova_compute[186849]: 2025-11-22 08:17:21.940 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:22 np0005531887 nova_compute[186849]: 2025-11-22 08:17:22.117 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:25 np0005531887 podman[236235]: 2025-11-22 08:17:25.882050005 +0000 UTC m=+0.095678727 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:17:26 np0005531887 nova_compute[186849]: 2025-11-22 08:17:26.943 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:27 np0005531887 ovn_controller[95130]: 2025-11-22T08:17:27Z|00497|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 03:17:27 np0005531887 nova_compute[186849]: 2025-11-22 08:17:27.119 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:31 np0005531887 podman[236255]: 2025-11-22 08:17:31.838720247 +0000 UTC m=+0.062891796 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:17:31 np0005531887 nova_compute[186849]: 2025-11-22 08:17:31.945 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:32 np0005531887 nova_compute[186849]: 2025-11-22 08:17:32.121 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:36 np0005531887 podman[236275]: 2025-11-22 08:17:36.84431079 +0000 UTC m=+0.060202830 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:17:36 np0005531887 nova_compute[186849]: 2025-11-22 08:17:36.948 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:37 np0005531887 nova_compute[186849]: 2025-11-22 08:17:37.122 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:17:37.351 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:17:37.352 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:17:37.352 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:41 np0005531887 podman[236301]: 2025-11-22 08:17:41.854438586 +0000 UTC m=+0.071751656 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 22 03:17:41 np0005531887 nova_compute[186849]: 2025-11-22 08:17:41.952 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:42 np0005531887 nova_compute[186849]: 2025-11-22 08:17:42.126 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:45 np0005531887 podman[236323]: 2025-11-22 08:17:45.863224809 +0000 UTC m=+0.079949688 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:17:45 np0005531887 podman[236324]: 2025-11-22 08:17:45.890613396 +0000 UTC m=+0.102503176 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:17:46 np0005531887 nova_compute[186849]: 2025-11-22 08:17:46.956 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:47 np0005531887 nova_compute[186849]: 2025-11-22 08:17:47.134 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:51 np0005531887 podman[236372]: 2025-11-22 08:17:51.83476206 +0000 UTC m=+0.054062348 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:17:51 np0005531887 nova_compute[186849]: 2025-11-22 08:17:51.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:52 np0005531887 nova_compute[186849]: 2025-11-22 08:17:52.137 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:56 np0005531887 podman[236396]: 2025-11-22 08:17:56.87748608 +0000 UTC m=+0.086801727 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:17:56 np0005531887 nova_compute[186849]: 2025-11-22 08:17:56.962 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.022 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.023 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.043 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.139 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.194 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.194 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.199 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.200 186853 INFO nova.compute.claims [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.375 186853 DEBUG nova.compute.provider_tree [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.388 186853 DEBUG nova.scheduler.client.report [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.453 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.454 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.514 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.515 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.567 186853 INFO nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.600 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.716 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.717 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.718 186853 INFO nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Creating image(s)#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.718 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.719 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.719 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.736 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.809 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.810 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.811 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.824 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.891 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:57 np0005531887 nova_compute[186849]: 2025-11-22 08:17:57.892 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.071 186853 DEBUG nova.policy [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.207 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk 1073741824" returned: 0 in 0.314s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.208 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.209 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.272 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.274 186853 DEBUG nova.virt.disk.api [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.274 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.343 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.344 186853 DEBUG nova.virt.disk.api [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.344 186853 DEBUG nova.objects.instance [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid dd01a59a-8825-4686-8ad2-48c0d7c29bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.363 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.363 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Ensure instance console log exists: /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.364 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.364 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.364 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:17:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:17:58.507 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:17:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:17:58.508 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:17:58 np0005531887 nova_compute[186849]: 2025-11-22 08:17:58.509 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:17:59 np0005531887 nova_compute[186849]: 2025-11-22 08:17:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:17:59 np0005531887 nova_compute[186849]: 2025-11-22 08:17:59.964 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Successfully created port: 9dcbb883-4317-4193-a384-0d8b55f051a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:18:00 np0005531887 nova_compute[186849]: 2025-11-22 08:18:00.858 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Successfully created port: da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:18:01 np0005531887 nova_compute[186849]: 2025-11-22 08:18:01.965 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.051 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Successfully updated port: 9dcbb883-4317-4193-a384-0d8b55f051a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.142 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.207 186853 DEBUG nova.compute.manager [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.207 186853 DEBUG nova.compute.manager [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing instance network info cache due to event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.207 186853 DEBUG oslo_concurrency.lockutils [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.207 186853 DEBUG oslo_concurrency.lockutils [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.208 186853 DEBUG nova.network.neutron [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing network info cache for port 9dcbb883-4317-4193-a384-0d8b55f051a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:02 np0005531887 nova_compute[186849]: 2025-11-22 08:18:02.556 186853 DEBUG nova.network.neutron [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:18:02 np0005531887 podman[236430]: 2025-11-22 08:18:02.835956488 +0000 UTC m=+0.058722043 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.080 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Successfully updated port: da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.091 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.229 186853 DEBUG nova.network.neutron [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.246 186853 DEBUG oslo_concurrency.lockutils [req-b7a8ffdf-1035-43b7-8293-60e12133a030 req-feb0847c-8e1e-4963-8866-fe6cba86cd9a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.247 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.247 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:18:03 np0005531887 nova_compute[186849]: 2025-11-22 08:18:03.622 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:18:04 np0005531887 nova_compute[186849]: 2025-11-22 08:18:04.384 186853 DEBUG nova.compute.manager [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-changed-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:04 np0005531887 nova_compute[186849]: 2025-11-22 08:18:04.385 186853 DEBUG nova.compute.manager [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing instance network info cache due to event network-changed-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:04 np0005531887 nova_compute[186849]: 2025-11-22 08:18:04.385 186853 DEBUG oslo_concurrency.lockutils [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:04.510 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.453 186853 DEBUG nova.network.neutron [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.480 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.481 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance network_info: |[{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.481 186853 DEBUG oslo_concurrency.lockutils [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.482 186853 DEBUG nova.network.neutron [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing network info cache for port da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.485 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Start _get_guest_xml network_info=[{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.490 186853 WARNING nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.500 186853 DEBUG nova.virt.libvirt.host [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.501 186853 DEBUG nova.virt.libvirt.host [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.505 186853 DEBUG nova.virt.libvirt.host [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.506 186853 DEBUG nova.virt.libvirt.host [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.507 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.507 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.508 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.508 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.508 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.508 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.509 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.509 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.509 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.509 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.510 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.510 186853 DEBUG nova.virt.hardware [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.515 186853 DEBUG nova.virt.libvirt.vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:17:57Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.515 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.516 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.517 186853 DEBUG nova.virt.libvirt.vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:17:57Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.517 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.518 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
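Each `Converting VIF` record above embeds the full Neutron VIF as a JSON document between the literal text `Converting VIF ` and the trailing `nova_to_osvif_vif` source reference. When digging through logs like these, a small helper (hypothetical name, and it assumes exactly this record layout) can recover the dict for inspection:

```python
import json

def extract_vif(line: str) -> dict:
    """Pull the JSON VIF payload out of a 'Converting VIF {...}' log line.

    Assumes the payload sits between 'Converting VIF ' and the trailing
    ' nova_to_osvif_vif' source reference, as in the records above.
    """
    start = line.index("Converting VIF ") + len("Converting VIF ")
    end = line.rindex(" nova_to_osvif_vif")
    return json.loads(line[start:end])

# Minimal synthetic line in the same shape as the real records:
line = ('DEBUG nova.network.os_vif_util [...] Converting VIF '
        '{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "type": "ovs", '
        '"details": {"bridge_name": "br-int"}} '
        'nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511')
vif = extract_vif(line)
```

The same extraction works on the `plug` records further down, which log the identical VIF dict before the os-vif call.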
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.520 186853 DEBUG nova.objects.instance [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid dd01a59a-8825-4686-8ad2-48c0d7c29bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.539 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <uuid>dd01a59a-8825-4686-8ad2-48c0d7c29bcf</uuid>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <name>instance-00000095</name>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-686264333</nova:name>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:18:06</nova:creationTime>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:port uuid="9dcbb883-4317-4193-a384-0d8b55f051a7">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        <nova:port uuid="da46e34b-ec37-4cc4-b1ab-4e8564ebbb60">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe4b:a191" ipVersion="6"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="serial">dd01a59a-8825-4686-8ad2-48c0d7c29bcf</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="uuid">dd01a59a-8825-4686-8ad2-48c0d7c29bcf</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.config"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:ae:bf:ae"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <target dev="tap9dcbb883-43"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:4b:a1:91"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <target dev="tapda46e34b-ec"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/console.log" append="off"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:18:06 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:18:06 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:18:06 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:18:06 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
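The `<nova:instance>` metadata in the generated domain XML is namespaced under `http://openstack.org/xmlns/libvirt/nova/1.1`, so the port and fixed-IP entries can be recovered from a dumped domain with the stdlib `xml.etree` alone. A sketch, assuming the layout shown above:

```python
import xml.etree.ElementTree as ET

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

def instance_ips(domain_xml: str):
    """Return (port_uuid, address, ipVersion) tuples from the nova metadata."""
    root = ET.fromstring(domain_xml)
    out = []
    for port in root.findall(".//nova:port", NOVA_NS):
        for ip in port.findall("nova:ip", NOVA_NS):
            out.append((port.get("uuid"), ip.get("address"), ip.get("ipVersion")))
    return out

# Trimmed-down copy of the domain dumped above:
xml = """<domain type="kvm">
 <metadata>
   <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
     <nova:ports>
       <nova:port uuid="9dcbb883-4317-4193-a384-0d8b55f051a7">
         <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
       </nova:port>
       <nova:port uuid="da46e34b-ec37-4cc4-b1ab-4e8564ebbb60">
         <nova:ip type="fixed" address="2001:db8::f816:3eff:fe4b:a191" ipVersion="6"/>
       </nova:port>
     </nova:ports>
   </nova:instance>
 </metadata>
</domain>"""
ips = instance_ips(xml)
```

The same approach reads equally well from `virsh dumpxml <domain>` output on the compute host, since nova writes this metadata into every guest it defines.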
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.540 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Preparing to wait for external event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.541 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.541 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.541 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.542 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Preparing to wait for external event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.542 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.542 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.542 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
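The `prepare_for_instance_event` / lock sequence above registers a waiter for each external `network-vif-plugged-<port-uuid>` event *before* the VIFs are plugged, so the Neutron notification cannot race past the listener. A rough stand-alone sketch of that registration pattern (plain `threading` primitives here are a simplification of what nova actually uses):

```python
import threading

class InstanceEvents:
    """Register interest in named external events before they can arrive."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # event name -> threading.Event

    def prepare(self, name: str) -> threading.Event:
        # Mirrors _create_or_get_event: idempotent, guarded by the lock
        # that the log shows being acquired and released around each call.
        with self._lock:
            return self._events.setdefault(name, threading.Event())

    def deliver(self, name: str) -> None:
        # Called when the external notification (e.g. from Neutron) arrives.
        self.prepare(name).set()

events = InstanceEvents()
name = "network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7"
waiter = events.prepare(name)   # registered before plugging starts
events.deliver(name)            # notification arrives later; waiter unblocks
```

Because `prepare` is idempotent, a delivery that lands between registration and the eventual wait is never lost: the `Event` is already set when the spawn path blocks on it.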
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.543 186853 DEBUG nova.virt.libvirt.vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:17:57Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.543 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.544 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.545 186853 DEBUG os_vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.545 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.546 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.546 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.550 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.550 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dcbb883-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.551 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9dcbb883-43, col_values=(('external_ids', {'iface-id': '9dcbb883-4317-4193-a384-0d8b55f051a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:bf:ae', 'vm-uuid': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.553 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 NetworkManager[55210]: <info>  [1763799486.5537] manager: (tap9dcbb883-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.561 186853 INFO os_vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43')#033[00m
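The two ovsdbapp commands in the plug transaction above (`AddPortCommand` on br-int plus `DbSetCommand` writing `external_ids` on the Interface row) correspond roughly to a single `ovs-vsctl --may-exist add-port ... -- set Interface ...` invocation from the shell. A small helper that renders that equivalent command for comparison (illustrative only; the real code speaks the OVSDB protocol to ovsdb-server directly rather than shelling out):

```python
import shlex

def plug_command(bridge: str, port: str, external_ids: dict) -> str:
    """Render the ovs-vsctl equivalent of the AddPort + DbSet transaction."""
    parts = ["ovs-vsctl", "--may-exist", "add-port", bridge, port,
             "--", "set", "Interface", port]
    parts += [f"external_ids:{k}={shlex.quote(v)}"
              for k, v in external_ids.items()]
    return " ".join(parts)

# The exact column values from the DbSetCommand in the log:
cmd = plug_command("br-int", "tap9dcbb883-43", {
    "iface-id": "9dcbb883-4317-4193-a384-0d8b55f051a7",
    "iface-status": "active",
    "attached-mac": "fa:16:3e:ae:bf:ae",
    "vm-uuid": "dd01a59a-8825-4686-8ad2-48c0d7c29bcf",
})
```

The `iface-id` external_id is what ovn-controller matches against the OVN logical port, which is why the `network-vif-plugged` event only fires after this row appears in the local OVSDB.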
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.562 186853 DEBUG nova.virt.libvirt.vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:17:57Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.563 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.563 186853 DEBUG nova.network.os_vif_util [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.563 186853 DEBUG os_vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.564 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.564 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.568 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.568 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda46e34b-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.568 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda46e34b-ec, col_values=(('external_ids', {'iface-id': 'da46e34b-ec37-4cc4-b1ab-4e8564ebbb60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:a1:91', 'vm-uuid': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.570 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 NetworkManager[55210]: <info>  [1763799486.5717] manager: (tapda46e34b-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.582 186853 INFO os_vif [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec')#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.693 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.693 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.693 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:ae:bf:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.693 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:4b:a1:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.694 186853 INFO nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Using config drive#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.861 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.923 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.923 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.989 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:06 np0005531887 nova_compute[186849]: 2025-11-22 08:18:06.991 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000095, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.config'#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.144 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.167 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.169 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5739MB free_disk=73.2735595703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.169 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.169 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.250 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance dd01a59a-8825-4686-8ad2-48c0d7c29bcf actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.250 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.250 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.286 186853 INFO nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Creating config drive at /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.config#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.291 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbbdwynht execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.321 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.336 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.359 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:18:07 np0005531887 nova_compute[186849]: 2025-11-22 08:18:07.359 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:07 np0005531887 podman[236466]: 2025-11-22 08:18:07.836164578 +0000 UTC m=+0.055105554 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.198 186853 DEBUG nova.network.neutron [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updated VIF entry in instance network info cache for port da46e34b-ec37-4cc4-b1ab-4e8564ebbb60. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.198 186853 DEBUG nova.network.neutron [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.215 186853 DEBUG oslo_concurrency.lockutils [req-474873ee-e6af-4f3b-8aa8-2b890b1ba197 req-3ef4c609-7571-4714-bf9b-83cc85dc4ab7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.359 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.360 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.360 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.376 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.377 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.426 186853 DEBUG oslo_concurrency.processutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbbdwynht" returned: 0 in 1.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:18:08 np0005531887 kernel: tap9dcbb883-43: entered promiscuous mode
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.4920] manager: (tap9dcbb883-43): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00498|binding|INFO|Claiming lport 9dcbb883-4317-4193-a384-0d8b55f051a7 for this chassis.
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00499|binding|INFO|9dcbb883-4317-4193-a384-0d8b55f051a7: Claiming fa:16:3e:ae:bf:ae 10.100.0.7
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.505 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5103] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5108] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.509 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5151] manager: (tapda46e34b-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Nov 22 03:18:08 np0005531887 systemd-udevd[236510]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:18:08 np0005531887 systemd-udevd[236509]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.537 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:bf:ae 10.100.0.7'], port_security=['fa:16:3e:ae:bf:ae 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28285a99-0933-48f9-aee6-f1e507bcd777', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ce7fc5f-5ca9-4729-bcf9-4866d6397f92, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=9dcbb883-4317-4193-a384-0d8b55f051a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.538 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 9dcbb883-4317-4193-a384-0d8b55f051a7 in datapath 28285a99-0933-48f9-aee6-f1e507bcd777 bound to our chassis#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.539 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28285a99-0933-48f9-aee6-f1e507bcd777#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5435] device (tap9dcbb883-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5450] device (tap9dcbb883-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.553 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d589f729-ce87-4288-a0e5-62772b508ee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.555 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28285a99-01 in ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.557 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28285a99-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.557 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[36c965b4-3dd3-45f7-ad21-ed33d658ffbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.558 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7359f1a6-33ac-4ae1-b700-5ea245d55ab8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 systemd-machined[153180]: New machine qemu-54-instance-00000095.
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.571 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[c9484304-ed0b-4764-926b-d24a2939e1cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 systemd[1]: Started Virtual Machine qemu-54-instance-00000095.
Nov 22 03:18:08 np0005531887 kernel: tapda46e34b-ec: entered promiscuous mode
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5932] device (tapda46e34b-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.5946] device (tapda46e34b-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.595 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.597 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[577776e4-3e47-458c-94a2-70e030b8bb41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00500|binding|INFO|Claiming lport da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 for this chassis.
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00501|binding|INFO|da46e34b-ec37-4cc4-b1ab-4e8564ebbb60: Claiming fa:16:3e:4b:a1:91 2001:db8::f816:3eff:fe4b:a191
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00502|binding|INFO|Setting lport 9dcbb883-4317-4193-a384-0d8b55f051a7 ovn-installed in OVS
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.612 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00503|binding|INFO|Setting lport 9dcbb883-4317-4193-a384-0d8b55f051a7 up in Southbound
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.620 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:a1:91 2001:db8::f816:3eff:fe4b:a191'], port_security=['fa:16:3e:4b:a1:91 2001:db8::f816:3eff:fe4b:a191'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:a191/64', 'neutron:device_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.627 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[22f3e4f7-1852-48bc-93b6-d0a7b52b4aa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00504|binding|INFO|Setting lport da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 ovn-installed in OVS
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00505|binding|INFO|Setting lport da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 up in Southbound
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.632 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.633 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3f94443a-b161-4299-a9e3-a408c6a984f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.6337] manager: (tap28285a99-00): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Nov 22 03:18:08 np0005531887 systemd-udevd[236515]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.662 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[6dcd12c9-32cd-450e-a199-365bf8109014]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.665 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7231d44e-abea-45bc-ac3c-a73dbc25fc64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.6883] device (tap28285a99-00): carrier: link connected
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.696 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7acafe0b-e03a-4aeb-9a8f-2ff0d6ae07be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.714 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a210a575-f393-4f63-a5c3-2e4d2a297afe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28285a99-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:93:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623129, 'reachable_time': 25407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236546, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.733 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4e5eab-57d0-458a-a550-ebe2b002058d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:938c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623129, 'tstamp': 623129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236547, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.752 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5b242f48-ff4e-46f4-b280-32a6a1f1e9da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28285a99-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:93:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623129, 'reachable_time': 25407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236548, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.786 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[609aaf89-b6c1-41ed-97c4-eab63efeecc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.854 186853 DEBUG nova.compute.manager [req-be906a3f-f03f-49cd-8bb4-5517f6cb01c8 req-ad0cca2e-27b1-4b53-b7fa-b06110d60b00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.854 186853 DEBUG oslo_concurrency.lockutils [req-be906a3f-f03f-49cd-8bb4-5517f6cb01c8 req-ad0cca2e-27b1-4b53-b7fa-b06110d60b00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.854 186853 DEBUG oslo_concurrency.lockutils [req-be906a3f-f03f-49cd-8bb4-5517f6cb01c8 req-ad0cca2e-27b1-4b53-b7fa-b06110d60b00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.854 186853 DEBUG oslo_concurrency.lockutils [req-be906a3f-f03f-49cd-8bb4-5517f6cb01c8 req-ad0cca2e-27b1-4b53-b7fa-b06110d60b00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.854 186853 DEBUG nova.compute.manager [req-be906a3f-f03f-49cd-8bb4-5517f6cb01c8 req-ad0cca2e-27b1-4b53-b7fa-b06110d60b00 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Processing event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.857 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3a4e59-5324-4cff-a8b2-d5feab4949c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.858 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28285a99-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.859 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.859 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28285a99-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.861 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 NetworkManager[55210]: <info>  [1763799488.8629] manager: (tap28285a99-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Nov 22 03:18:08 np0005531887 kernel: tap28285a99-00: entered promiscuous mode
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.864 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.865 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28285a99-00, col_values=(('external_ids', {'iface-id': '44730bfb-6390-4f63-a416-89f912674e46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.867 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.867 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:08Z|00506|binding|INFO|Releasing lport 44730bfb-6390-4f63-a416-89f912674e46 from this chassis (sb_readonly=0)
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.869 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.869 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d355aecc-b38c-48da-9b6f-5ed1a05b15d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.870 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-28285a99-0933-48f9-aee6-f1e507bcd777
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 28285a99-0933-48f9-aee6-f1e507bcd777
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:18:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:08.872 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'env', 'PROCESS_TAG=haproxy-28285a99-0933-48f9-aee6-f1e507bcd777', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28285a99-0933-48f9-aee6-f1e507bcd777.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.951 186853 DEBUG nova.compute.manager [req-be6e7b0f-caad-4c7e-939a-5d4b0fa13feb req-e568ba70-ddd5-4da1-9a0d-ccbe6d4d85f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.951 186853 DEBUG oslo_concurrency.lockutils [req-be6e7b0f-caad-4c7e-939a-5d4b0fa13feb req-e568ba70-ddd5-4da1-9a0d-ccbe6d4d85f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.952 186853 DEBUG oslo_concurrency.lockutils [req-be6e7b0f-caad-4c7e-939a-5d4b0fa13feb req-e568ba70-ddd5-4da1-9a0d-ccbe6d4d85f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.952 186853 DEBUG oslo_concurrency.lockutils [req-be6e7b0f-caad-4c7e-939a-5d4b0fa13feb req-e568ba70-ddd5-4da1-9a0d-ccbe6d4d85f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:08 np0005531887 nova_compute[186849]: 2025-11-22 08:18:08.952 186853 DEBUG nova.compute.manager [req-be6e7b0f-caad-4c7e-939a-5d4b0fa13feb req-e568ba70-ddd5-4da1-9a0d-ccbe6d4d85f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Processing event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.280 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799489.2796302, dd01a59a-8825-4686-8ad2-48c0d7c29bcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.280 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] VM Started (Lifecycle Event)#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.282 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.286 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.290 186853 INFO nova.virt.libvirt.driver [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance spawned successfully.#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.290 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.302 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.307 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.311 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.311 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.312 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.312 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.313 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.313 186853 DEBUG nova.virt.libvirt.driver [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.321 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.322 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799489.2800314, dd01a59a-8825-4686-8ad2-48c0d7c29bcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.322 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:18:09 np0005531887 podman[236587]: 2025-11-22 08:18:09.231904823 +0000 UTC m=+0.022177060 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.344 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.347 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799489.2851853, dd01a59a-8825-4686-8ad2-48c0d7c29bcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.348 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.377 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.381 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.384 186853 INFO nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Took 11.67 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.385 186853 DEBUG nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.423 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.464 186853 INFO nova.compute.manager [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Took 12.34 seconds to build instance.#033[00m
Nov 22 03:18:09 np0005531887 nova_compute[186849]: 2025-11-22 08:18:09.479 186853 DEBUG oslo_concurrency.lockutils [None req-e637aa12-6703-46aa-82f0-24ef53b9cab0 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:09 np0005531887 podman[236587]: 2025-11-22 08:18:09.500178908 +0000 UTC m=+0.290451145 container create 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:18:09 np0005531887 systemd[1]: Started libpod-conmon-82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165.scope.
Nov 22 03:18:09 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:18:09 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2096d3a3fad7fd9169d98a51d3dd599a1e7474195815a3502a73b178c7a2a0ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:18:09 np0005531887 podman[236587]: 2025-11-22 08:18:09.662989343 +0000 UTC m=+0.453261570 container init 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:18:09 np0005531887 podman[236587]: 2025-11-22 08:18:09.671685708 +0000 UTC m=+0.461957905 container start 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:18:09 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [NOTICE]   (236607) : New worker (236609) forked
Nov 22 03:18:09 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [NOTICE]   (236607) : Loading success.
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.782 104084 INFO neutron.agent.ovn.metadata.agent [-] Port da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.784 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 206a04da-ce2f-48ff-99c7-e70706547580#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.796 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7fa662f5-7ae5-4413-b605-ea785f8e66cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.797 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap206a04da-c1 in ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.800 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap206a04da-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.800 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e082fd20-9ff7-4cf4-88a3-30adb902155a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.802 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[95e12eb6-14ae-4dc4-a4ce-4dcbb51a1b3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.813 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0ad765-1191-4882-a753-dbf2e6ae3735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.839 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ca874bf8-2cac-4b01-ad9d-030822edcbb1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.867 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2170b8fb-e082-46f3-8964-f922e41b528a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.875 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[83505e13-20ca-4368-a504-a54020053a08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 systemd-udevd[236540]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:18:09 np0005531887 NetworkManager[55210]: <info>  [1763799489.8774] manager: (tap206a04da-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.905 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bb368e-24cb-4400-8f26-14a5a64f165c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.908 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1853ddb5-e96b-476a-a2f9-f834abb73048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 NetworkManager[55210]: <info>  [1763799489.9313] device (tap206a04da-c0): carrier: link connected
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.939 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa8508d-4ef6-4c99-8be6-76f5cf50bf8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.957 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[826bfda6-e43d-4024-b465-2874c14dd01c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap206a04da-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fc:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623253, 'reachable_time': 32616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236628, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.971 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[90183436-a9cd-414e-b8fc-84152ea0e567]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:fcb7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623253, 'tstamp': 623253}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236629, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:09.990 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[50d20669-2134-45e9-84d6-e160e1e1e055]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap206a04da-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fc:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623253, 'reachable_time': 32616, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236630, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.022 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cca6fc91-2f27-4a2c-a8fb-1c3a5c098b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.053 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[778f136c-5b6f-41a7-b113-44750f5b95f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.055 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap206a04da-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.055 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.056 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap206a04da-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.058 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:10 np0005531887 NetworkManager[55210]: <info>  [1763799490.0594] manager: (tap206a04da-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Nov 22 03:18:10 np0005531887 kernel: tap206a04da-c0: entered promiscuous mode
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.060 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.061 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap206a04da-c0, col_values=(('external_ids', {'iface-id': '9e35917e-9d7c-4228-9823-4967f6df52f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.062 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:10 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:10Z|00507|binding|INFO|Releasing lport 9e35917e-9d7c-4228-9823-4967f6df52f0 from this chassis (sb_readonly=0)
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.064 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.065 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.066 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d8988e-0cd9-4103-ba61-ec25f3b87108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.067 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-206a04da-ce2f-48ff-99c7-e70706547580
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 206a04da-ce2f-48ff-99c7-e70706547580
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:18:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:10.067 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'env', 'PROCESS_TAG=haproxy-206a04da-ce2f-48ff-99c7-e70706547580', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/206a04da-ce2f-48ff-99c7-e70706547580.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.075 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:10 np0005531887 podman[236661]: 2025-11-22 08:18:10.410619532 +0000 UTC m=+0.020607561 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:18:10 np0005531887 podman[236661]: 2025-11-22 08:18:10.735983527 +0000 UTC m=+0.345971536 container create a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:10 np0005531887 systemd[1]: Started libpod-conmon-a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17.scope.
Nov 22 03:18:10 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:18:10 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1328f86e78b08da4844bd99389c998f6c06fd3c55c9dbfaedf7ebffd9e60cac5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:18:10 np0005531887 podman[236661]: 2025-11-22 08:18:10.934014745 +0000 UTC m=+0.544002774 container init a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:18:10 np0005531887 podman[236661]: 2025-11-22 08:18:10.940813583 +0000 UTC m=+0.550801582 container start a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:18:10 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [NOTICE]   (236683) : New worker (236685) forked
Nov 22 03:18:10 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [NOTICE]   (236683) : Loading success.
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.960 186853 DEBUG nova.compute.manager [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.961 186853 DEBUG oslo_concurrency.lockutils [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.961 186853 DEBUG oslo_concurrency.lockutils [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.962 186853 DEBUG oslo_concurrency.lockutils [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.962 186853 DEBUG nova.compute.manager [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:10 np0005531887 nova_compute[186849]: 2025-11-22 08:18:10.962 186853 WARNING nova.compute.manager [req-2f9b89a4-2e0c-4db2-82fd-26a36cdf31f9 req-41a74637-6cd1-4b9f-af80-9d507d9adb0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received unexpected event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.059 186853 DEBUG nova.compute.manager [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.060 186853 DEBUG oslo_concurrency.lockutils [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.060 186853 DEBUG oslo_concurrency.lockutils [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.060 186853 DEBUG oslo_concurrency.lockutils [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.061 186853 DEBUG nova.compute.manager [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.061 186853 WARNING nova.compute.manager [req-2de7f372-76f6-4873-a08f-5d06ad186459 req-895f74dd-d9bb-4fa8-b779-b06ce7ba6d75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received unexpected event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.572 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:11 np0005531887 nova_compute[186849]: 2025-11-22 08:18:11.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:12 np0005531887 nova_compute[186849]: 2025-11-22 08:18:12.147 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:12 np0005531887 podman[236694]: 2025-11-22 08:18:12.848140399 +0000 UTC m=+0.066073474 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6)
Nov 22 03:18:14 np0005531887 nova_compute[186849]: 2025-11-22 08:18:14.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:16 np0005531887 nova_compute[186849]: 2025-11-22 08:18:16.575 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:16 np0005531887 podman[236714]: 2025-11-22 08:18:16.846064234 +0000 UTC m=+0.066783643 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:18:16 np0005531887 podman[236715]: 2025-11-22 08:18:16.880211459 +0000 UTC m=+0.095828301 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:18:17 np0005531887 nova_compute[186849]: 2025-11-22 08:18:17.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.175 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.465 186853 DEBUG nova.compute.manager [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.465 186853 DEBUG nova.compute.manager [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing instance network info cache due to event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.466 186853 DEBUG oslo_concurrency.lockutils [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.467 186853 DEBUG oslo_concurrency.lockutils [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.467 186853 DEBUG nova.network.neutron [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing network info cache for port 9dcbb883-4317-4193-a384-0d8b55f051a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:18 np0005531887 nova_compute[186849]: 2025-11-22 08:18:18.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:19 np0005531887 nova_compute[186849]: 2025-11-22 08:18:19.803 186853 DEBUG nova.network.neutron [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updated VIF entry in instance network info cache for port 9dcbb883-4317-4193-a384-0d8b55f051a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:18:19 np0005531887 nova_compute[186849]: 2025-11-22 08:18:19.804 186853 DEBUG nova.network.neutron [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:19 np0005531887 nova_compute[186849]: 2025-11-22 08:18:19.826 186853 DEBUG oslo_concurrency.lockutils [req-0188137a-5348-4d12-a7a9-0f8b6dddb1d3 req-745a3714-8037-45b5-a85c-88a5a5989c6f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:21 np0005531887 nova_compute[186849]: 2025-11-22 08:18:21.578 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:22 np0005531887 nova_compute[186849]: 2025-11-22 08:18:22.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:22 np0005531887 nova_compute[186849]: 2025-11-22 08:18:22.150 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:22 np0005531887 podman[236759]: 2025-11-22 08:18:22.842299486 +0000 UTC m=+0.055065483 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:18:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:26Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ae:bf:ae 10.100.0.7
Nov 22 03:18:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:26Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ae:bf:ae 10.100.0.7
Nov 22 03:18:26 np0005531887 nova_compute[186849]: 2025-11-22 08:18:26.579 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:26 np0005531887 nova_compute[186849]: 2025-11-22 08:18:26.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:18:27 np0005531887 nova_compute[186849]: 2025-11-22 08:18:27.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:27 np0005531887 podman[236804]: 2025-11-22 08:18:27.839205955 +0000 UTC m=+0.062307332 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:18:31 np0005531887 nova_compute[186849]: 2025-11-22 08:18:31.582 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:32 np0005531887 nova_compute[186849]: 2025-11-22 08:18:32.157 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:33 np0005531887 podman[236824]: 2025-11-22 08:18:33.855313777 +0000 UTC m=+0.075832116 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:18:36 np0005531887 nova_compute[186849]: 2025-11-22 08:18:36.584 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.671 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'name': 'tempest-TestGettingAddress-server-686264333', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000095', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.676 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dd01a59a-8825-4686-8ad2-48c0d7c29bcf / tap9dcbb883-43 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.677 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dd01a59a-8825-4686-8ad2-48c0d7c29bcf / tapda46e34b-ec inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.677 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.678 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96c57ccd-3a66-4769-84d6-926fb1644f1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.672572', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd7f892a8-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'afe6a315e1d34cd4390eabb5331e1ab66930ece617e3fb4b4fc3253036de5669'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.672572', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd7f8a248-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '3c27967f762fa1e43f31289ebc1b663a8c6152ed0ebffa8b98dc23866b34244e'}]}, 'timestamp': '2025-11-22 08:18:36.678766', '_unique_id': '9f037412c62c4aaa91702f30042c6d9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.681 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.681 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '999a5c14-c14e-4e9f-8e50-b34877b20157', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.681211', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd7f91156-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '162c8635a2c73bef9d30a7af74d54f594527f34bf1d0ba762396921568ce41d1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.681211', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd7f91d86-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '3a52811f6098a033d735757eb1014b03752413b0304cc86af6f73e4df268b86d'}]}, 'timestamp': '2025-11-22 08:18:36.681911', '_unique_id': '46b4d2439fd645318d3c472356168cf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.707 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/cpu volume: 14410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53238d3c-0487-413c-8bcf-1a6a418157f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14410000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'timestamp': '2025-11-22T08:18:36.683706', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd7fd084c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.376726013, 'message_signature': 'e4d48dd5717c6d2c94e411a4b7c63682e5590d878e278a4ce49f3e1d094ba42b'}]}, 'timestamp': '2025-11-22 08:18:36.707665', '_unique_id': 'a78d1d3f4c8c4a70921477a9e17d06e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.709 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets volume: 159 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.710 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '530e2484-b740-4584-a18d-6e8b127632cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 159, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.709772', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd7fd6a8a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'ae3baaab978d3a2494cbeea9f4fe38e6999be2cbf7cafbb805b1f1e76b1373a2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.709772', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd7fd77b4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '6ecf46f03a344ab19e3d93fe2d1e7cf8ae6fd2d15da52704469071516530551d'}]}, 'timestamp': '2025-11-22 08:18:36.710448', '_unique_id': '7919d4a9540c4c2281e9892ccdadf217'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.711 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.712 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.712 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '309fc605-433a-4948-ba94-756fecf1f748', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.712062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd7fdc34a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'a27f45c34d8b4ced1e4f8455ea23a9b62c16e83cc8b8bcbd40b1fdc6f5afae61'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.712062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd7fdd060-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '52f6158f61adb72a256ce5160c4a66b164e9c1657cc07a0f1e61cfa4842b6202'}]}, 'timestamp': '2025-11-22 08:18:36.712711', '_unique_id': '847837cc086543a9a1078019365700d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.744 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.requests volume: 1100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.745 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b8fed20-116e-45f7-a980-8e9de2aa5723', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1100, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.714284', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd802c200-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '870315c89287e8445d050855b0ad2ec3e5c65a1880b5e36393809124865fb877'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.714284', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd802d1dc-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '2632e9e28834e7372c2438d1f01ae8c5441813d6cdcb69d7dbb40b2267503561'}]}, 'timestamp': '2025-11-22 08:18:36.745516', '_unique_id': '6b623669b5664cf696526aa3ea0f289a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.747 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.bytes volume: 30714368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.747 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58998ed5-2a7a-4a39-98f4-8333768d6c21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30714368, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.747431', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd80327c2-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '10b4d089292db7fd0b5e8e752a1b301d1e3c215a17b3fd5ad9ec43e4191716e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.747431', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd8033046-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '06bd5cde358e37c849a7bf8f74b64da853df0d769ecbc96844edfe4c43ed97ef'}]}, 'timestamp': '2025-11-22 08:18:36.747893', '_unique_id': 'd51b5d204e6b4d1fa53bd84d2b0a8297'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.748 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.749 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.749 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>]
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.763 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.764 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88893b81-51dc-4991-8dd8-36f02aa2b142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.749570', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd805a8bc-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': '8fd680d31205c6a956a7d7bca83572c9ba84c929373553ec2d4dc19f115295cd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.749570', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd805b794-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': 'b7251be199bf8aa475343afe2fd0284cfdd4f28241ed7a36f20d2adf0ecdd26a'}]}, 'timestamp': '2025-11-22 08:18:36.764513', '_unique_id': '5cf939044e9c4079882d3c5ffa08135e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.765 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.766 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>]
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.767 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.bytes volume: 26912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.767 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.bytes volume: 2418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '246e9668-95d3-444c-8416-d6f6bb6436d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26912, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.767005', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd80625c6-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '86fb4def9b5a403dfc2d25de855346b2bda4f11d7112cc070c628a7aa2d85a8a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2418, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.767005', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd8063282-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '359b6becc192a1f6e606979f416463f31198ffae65802bc19f3f75bd52ffd1f4'}]}, 'timestamp': '2025-11-22 08:18:36.767652', '_unique_id': '38d2797266c7459589e53483605792cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.768 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.769 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.769 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d210b5c-6a6a-417b-a3a5-a4a7f9706fc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.769411', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd806832c-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': '42750c30b948a6fe2f9aba6c150256710c0b75796b8267b0523971a8374a2e44'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.769411', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd8068e26-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': '1acde6534f1c0c1e20a570f040422b007d02dda487b1f0fe9e475d0f7a9e7928'}]}, 'timestamp': '2025-11-22 08:18:36.769982', '_unique_id': 'ef2dce17b0c24e429f2ac10228497d99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.770 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.771 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.771 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/memory.usage volume: 43.8515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ca7bfa9-7db8-4028-9cb4-3840c291d242', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.8515625, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'timestamp': '2025-11-22T08:18:36.771409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd806d12e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.376726013, 'message_signature': '29aa86859658b1e1b415f8d40ed2b3812f75b4d12cadf7a524e36b22abb57fcc'}]}, 'timestamp': '2025-11-22 08:18:36.771689', '_unique_id': '8c7e33cafc554c8aa4fe92b9aaf6e6d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.772 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets volume: 171 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ec68ff6-be37-4021-949e-90325fad0b71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 171, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.772864', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd8070932-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '6e501f578b0df43d8179061de362795d3d74a877bcde91e359ea675be97114f3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.772864', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd8071436-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '2a107a62786426c765fcbf192043fdbffa16ebfaaaa4ee25aea3d2c2dc9f5c7a'}]}, 'timestamp': '2025-11-22 08:18:36.773383', '_unique_id': 'ea7b925d9614421b874635013cbf935e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.774 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.774 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.bytes volume: 72900608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.774 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b8fc923-0132-490f-9613-511d9d51d330', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72900608, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.774529', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd8074af0-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '196df33ad8f0e46a794042a3a9864a2f9e9f3ab926ba2114bd771dff0a33da05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.774529', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd80754be-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '51bc52975ee39bc57a3fcbaafbc6c0cf3163be256f133c9a2b8abd8e59465daa'}]}, 'timestamp': '2025-11-22 08:18:36.775027', '_unique_id': '4e866617729f4c8cb7cf4eb2aa8d80ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.776 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.776 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b81796e-bc51-4040-81fb-bcd9daf1e3de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.776120', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd80787cc-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '6eb30517fa233da2ce9712f2110f9cfab51e5c2c4430ee294add69a4069ca79d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.776120', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd80790a0-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '570587ccb07077717fdaa0df3d37fc0469a2c15fcef0f9f4997830f0aa7ace18'}]}, 'timestamp': '2025-11-22 08:18:36.776559', '_unique_id': '920bb96d2cbe4996b28b769c9ad40025'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>]
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.777 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.latency volume: 41256668488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f365d11-2f16-4326-b1e1-87ceb9d569e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41256668488, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.777981', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd807d01a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '686ff75ad125e56b456436874032eea4ff7ddc2b99a3e5f5443bdb8e934c7bb3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.777981', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd807d8e4-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '6232c79693fbceb563c68325313e7f04e22406b1f2e5924222f976e23237027b'}]}, 'timestamp': '2025-11-22 08:18:36.778419', '_unique_id': '3e4619b7da3f4c56b43294768ee9fb60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.779 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.latency volume: 1829344963 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.779 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.read.latency volume: 87008699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a722897e-46ca-4293-a56f-e505c2616968', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1829344963, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.779520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd8080c74-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '5df0fb00197403c29ff3da73214de90d50493253bb102fa78a10099b8a74ae33'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 87008699, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.779520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd80813e0-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.384106475, 'message_signature': '89bcdbd95777498a72e2eaa63ff9f4b95881791bacb18c1f3ab0fa8d5602d8ed'}]}, 'timestamp': '2025-11-22 08:18:36.779925', '_unique_id': '0e2a2705935949ed9e6e299d5f0bd220'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.781 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.781 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '299a9195-f6ee-4ac3-b580-6137cbc5070c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.781062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd808496e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'b5f59b3c50f54c1987aa1a690e66ee0174d09fd242cb3fcc01743de42b475a32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.781062', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd8085594-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '683928e89db58e523ac2bd10c9c1007c2345f710f6d1f0fd28cc5be413beade9'}]}, 'timestamp': '2025-11-22 08:18:36.781611', '_unique_id': 'b3c54d74402048d1aa88039efe65e632'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-686264333>]
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.782 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b23b132-a842-4af5-b0a7-f51775bc8346', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.782977', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd8089374-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '1cdb7c425a16b68ca736a3a1860d8933a5904228d1fc4f46aabd5db9dba13c62'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.782977', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd8089c34-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '63f075e159b187948759dbc816edfc1dd66bf55510dcfe717df9861236c78fe8'}]}, 'timestamp': '2025-11-22 08:18:36.783415', '_unique_id': 'ee8df970e8da4ee1b479c6e6ceee88c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08242e6c-907d-462a-93bd-eced2cbf7d07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.785027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd808e3ce-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'be1c303227a7f4e5e97bbefcb72d72db67451c1c1da6e14dd6cf150089d8ecf1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.785027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd808ec70-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '3ed9a940aef63fc30d001d2d6a83096ed41394c01dcb6608d20d0c6c4255d564'}]}, 'timestamp': '2025-11-22 08:18:36.785466', '_unique_id': '14cebc5264334dc491b363de2f121f05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.785 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.786 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.786 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3854548b-957e-4f72-853c-75e62552ef20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-vda', 'timestamp': '2025-11-22T08:18:36.786530', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd8091e2a-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': 'eaee954b7f12d1b98acc66ca9fa6ebbd173ef9beaa262e81cb389d0e057fab6f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'dd01a59a-8825-4686-8ad2-48c0d7c29bcf-sda', 'timestamp': '2025-11-22T08:18:36.786530', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'instance-00000095', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd809292e-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.419393709, 'message_signature': 'ccb16af62eaeba91db6e795ba6e4380492f8d2841c3d953f5a1dfb182161b87e'}]}, 'timestamp': '2025-11-22 08:18:36.787012', '_unique_id': '2eefa3bcec93486c9d8543bd23399522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.787 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.788 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.bytes volume: 29309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.788 12 DEBUG ceilometer.compute.pollsters [-] dd01a59a-8825-4686-8ad2-48c0d7c29bcf/network.incoming.bytes volume: 1266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0172722-74fa-4a2d-a8de-ddcee98c3f7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29309, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tap9dcbb883-43', 'timestamp': '2025-11-22T08:18:36.788221', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tap9dcbb883-43', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ae:bf:ae', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9dcbb883-43'}, 'message_id': 'd8096114-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': 'e597ea7721ecd96e8d4e3e5155acc74a266719c0eeaad659f2b7d6b378cd812a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1266, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000095-dd01a59a-8825-4686-8ad2-48c0d7c29bcf-tapda46e34b-ec', 'timestamp': '2025-11-22T08:18:36.788221', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-686264333', 'name': 'tapda46e34b-ec', 'instance_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4b:a1:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda46e34b-ec'}, 'message_id': 'd80968a8-c77b-11f0-9b25-fa163ecc0304', 'monotonic_time': 6259.342368724, 'message_signature': '7ea7d78c1254eca1216397db0b4c045a96a00f016a1494cdbf9f9aaf5d04b752'}]}, 'timestamp': '2025-11-22 08:18:36.788645', '_unique_id': '3a0f90498b5e44c198925b537f6f2c1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:18:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:18:36.789 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.034 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.035 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.035 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.035 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.036 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.043 186853 INFO nova.compute.manager [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Terminating instance#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.050 186853 DEBUG nova.compute.manager [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:18:37 np0005531887 kernel: tap9dcbb883-43 (unregistering): left promiscuous mode
Nov 22 03:18:37 np0005531887 NetworkManager[55210]: <info>  [1763799517.0971] device (tap9dcbb883-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.105 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00508|binding|INFO|Releasing lport 9dcbb883-4317-4193-a384-0d8b55f051a7 from this chassis (sb_readonly=0)
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00509|binding|INFO|Setting lport 9dcbb883-4317-4193-a384-0d8b55f051a7 down in Southbound
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00510|binding|INFO|Removing iface tap9dcbb883-43 ovn-installed in OVS
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.111 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.119 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:bf:ae 10.100.0.7'], port_security=['fa:16:3e:ae:bf:ae 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28285a99-0933-48f9-aee6-f1e507bcd777', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ce7fc5f-5ca9-4729-bcf9-4866d6397f92, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=9dcbb883-4317-4193-a384-0d8b55f051a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.120 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 9dcbb883-4317-4193-a384-0d8b55f051a7 in datapath 28285a99-0933-48f9-aee6-f1e507bcd777 unbound from our chassis#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.121 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28285a99-0933-48f9-aee6-f1e507bcd777, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.122 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[29407161-aab4-4378-9a24-30dd1a30a249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.122 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 namespace which is not needed anymore#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.123 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 kernel: tapda46e34b-ec (unregistering): left promiscuous mode
Nov 22 03:18:37 np0005531887 NetworkManager[55210]: <info>  [1763799517.1323] device (tapda46e34b-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.135 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.151 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00511|binding|INFO|Releasing lport da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 from this chassis (sb_readonly=0)
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00512|binding|INFO|Setting lport da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 down in Southbound
Nov 22 03:18:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:18:37Z|00513|binding|INFO|Removing iface tapda46e34b-ec ovn-installed in OVS
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.185 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:a1:91 2001:db8::f816:3eff:fe4b:a191'], port_security=['fa:16:3e:4b:a1:91 2001:db8::f816:3eff:fe4b:a191'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:a191/64', 'neutron:device_id': 'dd01a59a-8825-4686-8ad2-48c0d7c29bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:37 np0005531887 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000095.scope: Deactivated successfully.
Nov 22 03:18:37 np0005531887 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000095.scope: Consumed 16.617s CPU time.
Nov 22 03:18:37 np0005531887 systemd-machined[153180]: Machine qemu-54-instance-00000095 terminated.
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [NOTICE]   (236607) : haproxy version is 2.8.14-c23fe91
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [NOTICE]   (236607) : path to executable is /usr/sbin/haproxy
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [WARNING]  (236607) : Exiting Master process...
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [WARNING]  (236607) : Exiting Master process...
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [ALERT]    (236607) : Current worker (236609) exited with code 143 (Terminated)
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[236603]: [WARNING]  (236607) : All workers exited. Exiting... (0)
Nov 22 03:18:37 np0005531887 systemd[1]: libpod-82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165.scope: Deactivated successfully.
Nov 22 03:18:37 np0005531887 podman[236874]: 2025-11-22 08:18:37.273539907 +0000 UTC m=+0.053113335 container died 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:18:37 np0005531887 NetworkManager[55210]: <info>  [1763799517.2974] manager: (tapda46e34b-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay-2096d3a3fad7fd9169d98a51d3dd599a1e7474195815a3502a73b178c7a2a0ad-merged.mount: Deactivated successfully.
Nov 22 03:18:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165-userdata-shm.mount: Deactivated successfully.
Nov 22 03:18:37 np0005531887 podman[236874]: 2025-11-22 08:18:37.326294771 +0000 UTC m=+0.105868199 container cleanup 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.327 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 systemd[1]: libpod-conmon-82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165.scope: Deactivated successfully.
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.352 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.352 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.353 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.358 186853 INFO nova.virt.libvirt.driver [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Instance destroyed successfully.#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.359 186853 DEBUG nova.objects.instance [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid dd01a59a-8825-4686-8ad2-48c0d7c29bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.383 186853 DEBUG nova.virt.libvirt.vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:18:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:18:09Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.384 186853 DEBUG nova.network.os_vif_util [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.385 186853 DEBUG nova.network.os_vif_util [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.385 186853 DEBUG os_vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.387 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.388 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dcbb883-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.392 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.396 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.399 186853 INFO os_vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:bf:ae,bridge_name='br-int',has_traffic_filtering=True,id=9dcbb883-4317-4193-a384-0d8b55f051a7,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dcbb883-43')#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.400 186853 DEBUG nova.virt.libvirt.vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-686264333',display_name='tempest-TestGettingAddress-server-686264333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-686264333',id=149,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:18:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-ynkhx0vc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:18:09Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=dd01a59a-8825-4686-8ad2-48c0d7c29bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.401 186853 DEBUG nova.network.os_vif_util [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.401 186853 DEBUG nova.network.os_vif_util [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.402 186853 DEBUG os_vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.404 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.404 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda46e34b-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:37 np0005531887 podman[236932]: 2025-11-22 08:18:37.405678094 +0000 UTC m=+0.054210652 container remove 82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.408 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.410 186853 INFO os_vif [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:a1:91,bridge_name='br-int',has_traffic_filtering=True,id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda46e34b-ec')#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.411 186853 INFO nova.virt.libvirt.driver [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Deleting instance files /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf_del#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.411 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7d8123-0f95-428b-b88c-a908e99b3e6a]: (4, ('Sat Nov 22 08:18:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 (82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165)\n82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165\nSat Nov 22 08:18:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 (82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165)\n82fd359214e4d78bf917ec357947882c99158b9d09ef3b2b3206b653c73cb165\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.412 186853 INFO nova.virt.libvirt.driver [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Deletion of /var/lib/nova/instances/dd01a59a-8825-4686-8ad2-48c0d7c29bcf_del complete#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.413 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[414b7df6-c970-44da-a277-4210c9370183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.414 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28285a99-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.416 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 kernel: tap28285a99-00: left promiscuous mode
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.424 186853 DEBUG nova.compute.manager [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-unplugged-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.425 186853 DEBUG oslo_concurrency.lockutils [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.426 186853 DEBUG oslo_concurrency.lockutils [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.426 186853 DEBUG oslo_concurrency.lockutils [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.426 186853 DEBUG nova.compute.manager [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-unplugged-9dcbb883-4317-4193-a384-0d8b55f051a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.426 186853 DEBUG nova.compute.manager [req-75c56d25-7477-4ae2-a6c9-9dd5086d6589 req-c720afb7-3d08-4e5a-851f-504114d12842 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-unplugged-9dcbb883-4317-4193-a384-0d8b55f051a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.431 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.433 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1ad0bc-6232-4b27-bc0f-5e226fb33827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.455 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c00eb5-7b27-4491-aaf9-bd4f68412e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.457 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8de2d0d3-d30c-4edf-aa8f-b53eb51f2464]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.475 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[68f7c4d5-a79a-4747-8435-4d48518a3400]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623123, 'reachable_time': 22024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236951, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.479 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.479 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[c022e2a3-7a2e-4372-bd8a-24b3e2d441d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 systemd[1]: run-netns-ovnmeta\x2d28285a99\x2d0933\x2d48f9\x2daee6\x2df1e507bcd777.mount: Deactivated successfully.
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.480 104084 INFO neutron.agent.ovn.metadata.agent [-] Port da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.482 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206a04da-ce2f-48ff-99c7-e70706547580, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.483 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[136ac1dd-96d7-40c1-bccd-2e04ab70c552]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.483 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 namespace which is not needed anymore#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.525 186853 DEBUG nova.compute.manager [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-unplugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.526 186853 DEBUG oslo_concurrency.lockutils [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.526 186853 DEBUG oslo_concurrency.lockutils [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.526 186853 DEBUG oslo_concurrency.lockutils [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.526 186853 DEBUG nova.compute.manager [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-unplugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.527 186853 DEBUG nova.compute.manager [req-a36848fd-d4c1-424a-9b71-689bc7d10f9e req-aef13ca0-4038-46f9-b62b-9e7eac3e3a36 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-unplugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.539 186853 INFO nova.compute.manager [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.539 186853 DEBUG oslo.service.loopingcall [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.540 186853 DEBUG nova.compute.manager [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.540 186853 DEBUG nova.network.neutron [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.607 186853 DEBUG nova.compute.manager [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.607 186853 DEBUG nova.compute.manager [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing instance network info cache due to event network-changed-9dcbb883-4317-4193-a384-0d8b55f051a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.607 186853 DEBUG oslo_concurrency.lockutils [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.608 186853 DEBUG oslo_concurrency.lockutils [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.608 186853 DEBUG nova.network.neutron [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Refreshing network info cache for port 9dcbb883-4317-4193-a384-0d8b55f051a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [NOTICE]   (236683) : haproxy version is 2.8.14-c23fe91
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [NOTICE]   (236683) : path to executable is /usr/sbin/haproxy
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [WARNING]  (236683) : Exiting Master process...
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [ALERT]    (236683) : Current worker (236685) exited with code 143 (Terminated)
Nov 22 03:18:37 np0005531887 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[236679]: [WARNING]  (236683) : All workers exited. Exiting... (0)
Nov 22 03:18:37 np0005531887 systemd[1]: libpod-a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17.scope: Deactivated successfully.
Nov 22 03:18:37 np0005531887 conmon[236679]: conmon a09bfd3c48b24a7545fd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17.scope/container/memory.events
Nov 22 03:18:37 np0005531887 podman[236967]: 2025-11-22 08:18:37.62714407 +0000 UTC m=+0.048069309 container died a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:18:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17-userdata-shm.mount: Deactivated successfully.
Nov 22 03:18:37 np0005531887 systemd[1]: var-lib-containers-storage-overlay-1328f86e78b08da4844bd99389c998f6c06fd3c55c9dbfaedf7ebffd9e60cac5-merged.mount: Deactivated successfully.
Nov 22 03:18:37 np0005531887 podman[236967]: 2025-11-22 08:18:37.661734466 +0000 UTC m=+0.082659705 container cleanup a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:18:37 np0005531887 systemd[1]: libpod-conmon-a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17.scope: Deactivated successfully.
Nov 22 03:18:37 np0005531887 podman[236997]: 2025-11-22 08:18:37.720506069 +0000 UTC m=+0.038026571 container remove a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.726 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5b71ee37-b72c-4b25-a5c5-2c8f056496e0]: (4, ('Sat Nov 22 08:18:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 (a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17)\na09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17\nSat Nov 22 08:18:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 (a09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17)\na09bfd3c48b24a7545fd2da109bec938dfb8cb21a991f763adaa7af919cfff17\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.727 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ba122f4d-114d-4e04-b0c3-4f7669d7afe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.728 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap206a04da-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.730 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 kernel: tap206a04da-c0: left promiscuous mode
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.731 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.734 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f9d2e2-c87a-42d1-85dc-923c141cbca2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 nova_compute[186849]: 2025-11-22 08:18:37.743 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.767 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a278767b-fcd1-4aeb-9096-04de00cee496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.769 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4a44cf6f-6d63-4c67-83ae-da3e174e25e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.786 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c9229807-842d-4008-b33a-79544fa603bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623247, 'reachable_time': 17534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237012, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.788 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:18:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:37.788 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4b7d87-fe1c-4ba6-982d-9a79dcac662a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:18:37 np0005531887 systemd[1]: run-netns-ovnmeta\x2d206a04da\x2dce2f\x2d48ff\x2d99c7\x2de70706547580.mount: Deactivated successfully.
Nov 22 03:18:38 np0005531887 podman[237013]: 2025-11-22 08:18:38.835759108 +0000 UTC m=+0.053255538 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.201 186853 DEBUG nova.compute.manager [req-c41067a4-b712-4a12-8376-08da1648c5ae req-759d146d-3ec2-4598-8229-77d5261d13cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-deleted-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.202 186853 INFO nova.compute.manager [req-c41067a4-b712-4a12-8376-08da1648c5ae req-759d146d-3ec2-4598-8229-77d5261d13cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Neutron deleted interface da46e34b-ec37-4cc4-b1ab-4e8564ebbb60; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.204 186853 DEBUG nova.network.neutron [req-c41067a4-b712-4a12-8376-08da1648c5ae req-759d146d-3ec2-4598-8229-77d5261d13cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.225 186853 DEBUG nova.compute.manager [req-c41067a4-b712-4a12-8376-08da1648c5ae req-759d146d-3ec2-4598-8229-77d5261d13cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Detach interface failed, port_id=da46e34b-ec37-4cc4-b1ab-4e8564ebbb60, reason: Instance dd01a59a-8825-4686-8ad2-48c0d7c29bcf could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 DEBUG nova.compute.manager [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 DEBUG oslo_concurrency.lockutils [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 DEBUG nova.compute.manager [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.544 186853 WARNING nova.compute.manager [req-5d67f213-e232-447e-a35d-58c1f0c9b2e5 req-0c5c3b70-dc41-4a68-9689-586bcf636029 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received unexpected event network-vif-plugged-9dcbb883-4317-4193-a384-0d8b55f051a7 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.613 186853 DEBUG nova.compute.manager [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.613 186853 DEBUG oslo_concurrency.lockutils [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.614 186853 DEBUG oslo_concurrency.lockutils [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.614 186853 DEBUG oslo_concurrency.lockutils [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.614 186853 DEBUG nova.compute.manager [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] No waiting events found dispatching network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:18:39 np0005531887 nova_compute[186849]: 2025-11-22 08:18:39.614 186853 WARNING nova.compute.manager [req-95e92651-9a4c-4ff5-9d23-3596b8ee2f0b req-2a4bf71b-edb6-42c6-b7be-e1b32b9f31f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received unexpected event network-vif-plugged-da46e34b-ec37-4cc4-b1ab-4e8564ebbb60 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.095 186853 DEBUG nova.network.neutron [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updated VIF entry in instance network info cache for port 9dcbb883-4317-4193-a384-0d8b55f051a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.096 186853 DEBUG nova.network.neutron [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [{"id": "9dcbb883-4317-4193-a384-0d8b55f051a7", "address": "fa:16:3e:ae:bf:ae", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dcbb883-43", "ovs_interfaceid": "9dcbb883-4317-4193-a384-0d8b55f051a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "address": "fa:16:3e:4b:a1:91", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4b:a191", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda46e34b-ec", "ovs_interfaceid": "da46e34b-ec37-4cc4-b1ab-4e8564ebbb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.114 186853 DEBUG nova.network.neutron [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.133 186853 DEBUG oslo_concurrency.lockutils [req-f5d73a59-faef-4836-b8d0-d0e60950feb6 req-53837be1-8ad2-4c2d-8bb5-11f245940f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dd01a59a-8825-4686-8ad2-48c0d7c29bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.137 186853 INFO nova.compute.manager [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Took 2.60 seconds to deallocate network for instance.#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.537 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.538 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.591 186853 DEBUG nova.compute.provider_tree [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.603 186853 DEBUG nova.scheduler.client.report [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.620 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.640 186853 INFO nova.scheduler.client.report [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance dd01a59a-8825-4686-8ad2-48c0d7c29bcf#033[00m
Nov 22 03:18:40 np0005531887 nova_compute[186849]: 2025-11-22 08:18:40.694 186853 DEBUG oslo_concurrency.lockutils [None req-0ce931af-0a49-4fd9-b4cd-945fdec25cf2 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "dd01a59a-8825-4686-8ad2-48c0d7c29bcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:41 np0005531887 nova_compute[186849]: 2025-11-22 08:18:41.325 186853 DEBUG nova.compute.manager [req-4677b30b-32e8-42d8-aa74-c31638a43f94 req-31546dbe-c509-4572-beef-96cda68bf49c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Received event network-vif-deleted-9dcbb883-4317-4193-a384-0d8b55f051a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:18:42 np0005531887 nova_compute[186849]: 2025-11-22 08:18:42.169 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:42 np0005531887 nova_compute[186849]: 2025-11-22 08:18:42.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:43 np0005531887 podman[237040]: 2025-11-22 08:18:43.85544653 +0000 UTC m=+0.074513264 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Nov 22 03:18:47 np0005531887 nova_compute[186849]: 2025-11-22 08:18:47.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:47 np0005531887 nova_compute[186849]: 2025-11-22 08:18:47.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:47 np0005531887 podman[237061]: 2025-11-22 08:18:47.830367627 +0000 UTC m=+0.054625412 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:18:47 np0005531887 podman[237062]: 2025-11-22 08:18:47.870637912 +0000 UTC m=+0.079453025 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:18:52 np0005531887 nova_compute[186849]: 2025-11-22 08:18:52.173 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:52 np0005531887 nova_compute[186849]: 2025-11-22 08:18:52.357 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799517.3558161, dd01a59a-8825-4686-8ad2-48c0d7c29bcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:18:52 np0005531887 nova_compute[186849]: 2025-11-22 08:18:52.357 186853 INFO nova.compute.manager [-] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:18:52 np0005531887 nova_compute[186849]: 2025-11-22 08:18:52.382 186853 DEBUG nova.compute.manager [None req-5ff8035c-aa9d-4b93-9b2c-a67e2e5ccbef - - - - - -] [instance: dd01a59a-8825-4686-8ad2-48c0d7c29bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:18:52 np0005531887 nova_compute[186849]: 2025-11-22 08:18:52.419 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:53 np0005531887 podman[237107]: 2025-11-22 08:18:53.838020728 +0000 UTC m=+0.063781747 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:18:55 np0005531887 nova_compute[186849]: 2025-11-22 08:18:55.070 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:55 np0005531887 nova_compute[186849]: 2025-11-22 08:18:55.195 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:57 np0005531887 nova_compute[186849]: 2025-11-22 08:18:57.175 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:57 np0005531887 nova_compute[186849]: 2025-11-22 08:18:57.421 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.753 186853 DEBUG nova.compute.manager [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 22 03:18:58 np0005531887 podman[237133]: 2025-11-22 08:18:58.840165846 +0000 UTC m=+0.063848670 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.880 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.880 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.913 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'pci_requests' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.924 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.924 186853 INFO nova.compute.claims [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.924 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'resources' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.933 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'numa_topology' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.942 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'pci_devices' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.975 186853 INFO nova.compute.resource_tracker [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating resource usage from migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f#033[00m
Nov 22 03:18:58 np0005531887 nova_compute[186849]: 2025-11-22 08:18:58.975 186853 DEBUG nova.compute.resource_tracker [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Starting to track incoming migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 22 03:18:59 np0005531887 nova_compute[186849]: 2025-11-22 08:18:59.035 186853 DEBUG nova.compute.provider_tree [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:18:59 np0005531887 nova_compute[186849]: 2025-11-22 08:18:59.045 186853 DEBUG nova.scheduler.client.report [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:18:59 np0005531887 nova_compute[186849]: 2025-11-22 08:18:59.062 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:18:59 np0005531887 nova_compute[186849]: 2025-11-22 08:18:59.062 186853 INFO nova.compute.manager [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Migrating#033[00m
Nov 22 03:18:59 np0005531887 nova_compute[186849]: 2025-11-22 08:18:59.498 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:18:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:59.498 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:18:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:59.499 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:18:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:18:59.500 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:00 np0005531887 nova_compute[186849]: 2025-11-22 08:19:00.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:00 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 03:19:00 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 03:19:00 np0005531887 systemd-logind[821]: New session 51 of user nova.
Nov 22 03:19:00 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 03:19:00 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 03:19:00 np0005531887 systemd[237155]: Queued start job for default target Main User Target.
Nov 22 03:19:00 np0005531887 systemd[237155]: Created slice User Application Slice.
Nov 22 03:19:00 np0005531887 systemd[237155]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:19:00 np0005531887 systemd[237155]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 03:19:00 np0005531887 systemd[237155]: Reached target Paths.
Nov 22 03:19:00 np0005531887 systemd[237155]: Reached target Timers.
Nov 22 03:19:00 np0005531887 systemd[237155]: Starting D-Bus User Message Bus Socket...
Nov 22 03:19:00 np0005531887 systemd[237155]: Starting Create User's Volatile Files and Directories...
Nov 22 03:19:00 np0005531887 systemd[237155]: Finished Create User's Volatile Files and Directories.
Nov 22 03:19:00 np0005531887 systemd[237155]: Listening on D-Bus User Message Bus Socket.
Nov 22 03:19:00 np0005531887 systemd[237155]: Reached target Sockets.
Nov 22 03:19:00 np0005531887 systemd[237155]: Reached target Basic System.
Nov 22 03:19:00 np0005531887 systemd[237155]: Reached target Main User Target.
Nov 22 03:19:00 np0005531887 systemd[237155]: Startup finished in 142ms.
Nov 22 03:19:00 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 03:19:01 np0005531887 systemd[1]: Started Session 51 of User nova.
Nov 22 03:19:01 np0005531887 systemd[1]: session-51.scope: Deactivated successfully.
Nov 22 03:19:01 np0005531887 systemd-logind[821]: Session 51 logged out. Waiting for processes to exit.
Nov 22 03:19:01 np0005531887 systemd-logind[821]: Removed session 51.
Nov 22 03:19:01 np0005531887 systemd-logind[821]: New session 53 of user nova.
Nov 22 03:19:01 np0005531887 systemd[1]: Started Session 53 of User nova.
Nov 22 03:19:01 np0005531887 systemd[1]: session-53.scope: Deactivated successfully.
Nov 22 03:19:01 np0005531887 systemd-logind[821]: Session 53 logged out. Waiting for processes to exit.
Nov 22 03:19:01 np0005531887 systemd-logind[821]: Removed session 53.
Nov 22 03:19:02 np0005531887 nova_compute[186849]: 2025-11-22 08:19:02.177 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:02 np0005531887 nova_compute[186849]: 2025-11-22 08:19:02.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.137 186853 DEBUG nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.138 186853 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.138 186853 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.139 186853 DEBUG oslo_concurrency.lockutils [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.139 186853 DEBUG nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:04 np0005531887 nova_compute[186849]: 2025-11-22 08:19:04.140 186853 WARNING nova.compute.manager [req-edb51260-2ec4-42d1-9570-d1ccd3bfb8ee req-194937e1-3d0e-4193-884e-cca5ba7f8113 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 22 03:19:04 np0005531887 systemd-logind[821]: New session 54 of user nova.
Nov 22 03:19:04 np0005531887 systemd[1]: Started Session 54 of User nova.
Nov 22 03:19:04 np0005531887 podman[237179]: 2025-11-22 08:19:04.726415207 +0000 UTC m=+0.073331254 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:19:05 np0005531887 systemd[1]: session-54.scope: Deactivated successfully.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Session 54 logged out. Waiting for processes to exit.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Removed session 54.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: New session 55 of user nova.
Nov 22 03:19:05 np0005531887 systemd[1]: Started Session 55 of User nova.
Nov 22 03:19:05 np0005531887 systemd[1]: session-55.scope: Deactivated successfully.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Session 55 logged out. Waiting for processes to exit.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Removed session 55.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: New session 56 of user nova.
Nov 22 03:19:05 np0005531887 systemd[1]: Started Session 56 of User nova.
Nov 22 03:19:05 np0005531887 systemd[1]: session-56.scope: Deactivated successfully.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Session 56 logged out. Waiting for processes to exit.
Nov 22 03:19:05 np0005531887 systemd-logind[821]: Removed session 56.
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.279 186853 DEBUG nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.280 186853 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.281 186853 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.281 186853 DEBUG oslo_concurrency.lockutils [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.281 186853 DEBUG nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.281 186853 WARNING nova.compute.manager [req-8ed7aea9-0e3a-4e97-9376-8b258a175743 req-6952e8a5-f6ce-40dd-b8ff-5f3fa975b553 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 22 03:19:06 np0005531887 nova_compute[186849]: 2025-11-22 08:19:06.471 186853 INFO nova.network.neutron [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.443 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.443 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.443 186853 DEBUG nova.network.neutron [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:19:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:07.549 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:07.551 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated#033[00m
Nov 22 03:19:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:07.552 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:07.553 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3edb19-99f3-4351-8649-1827b85928c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:19:07 np0005531887 nova_compute[186849]: 2025-11-22 08:19:07.783 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.415 186853 DEBUG nova.compute.manager [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.416 186853 DEBUG nova.compute.manager [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing instance network info cache due to event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.416 186853 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.926 186853 DEBUG nova.network.neutron [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.948 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.951 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.951 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:19:08 np0005531887 nova_compute[186849]: 2025-11-22 08:19:08.952 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.214 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.217 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.217 186853 INFO nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Creating image(s)#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.219 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.249 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.309 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.312 186853 DEBUG nova.virt.disk.api [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Checking if we can resize image /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.312 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.392 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.394 186853 DEBUG nova.virt.disk.api [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Cannot resize image /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.417 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.418 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Ensure instance console log exists: /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.419 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.419 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.419 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.422 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start _get_guest_xml network_info=[{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.429 186853 WARNING nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.437 186853 DEBUG nova.virt.libvirt.host [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.438 186853 DEBUG nova.virt.libvirt.host [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.442 186853 DEBUG nova.virt.libvirt.host [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.442 186853 DEBUG nova.virt.libvirt.host [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.443 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.444 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.444 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.444 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.445 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.445 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.445 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.445 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.446 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.446 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.446 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.446 186853 DEBUG nova.virt.hardware [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.447 186853 DEBUG nova.objects.instance [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.461 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.525 186853 DEBUG oslo_concurrency.processutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.526 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.527 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.528 186853 DEBUG oslo_concurrency.lockutils [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.529 186853 DEBUG nova.virt.libvirt.vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:18:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:06Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.529 186853 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.530 186853 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.532 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <uuid>ff7656a5-6680-4acd-a89d-fdc5e9fb914a</uuid>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <name>instance-00000096</name>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-104914358</nova:name>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:19:09</nova:creationTime>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        <nova:port uuid="a6be1de1-c2dd-4be7-89df-bfa4d9bc296c">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="serial">ff7656a5-6680-4acd-a89d-fdc5e9fb914a</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="uuid">ff7656a5-6680-4acd-a89d-fdc5e9fb914a</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk.config"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:e8:eb:ea"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <target dev="tapa6be1de1-c2"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/console.log" append="off"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:19:09 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:19:09 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:19:09 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:19:09 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.533 186853 DEBUG nova.virt.libvirt.vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:18:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:06Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.534 186853 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1838609208", "vif_mac": "fa:16:3e:e8:eb:ea"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.534 186853 DEBUG nova.network.os_vif_util [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.534 186853 DEBUG os_vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.535 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.536 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.541 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.541 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6be1de1-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.542 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6be1de1-c2, col_values=(('external_ids', {'iface-id': 'a6be1de1-c2dd-4be7-89df-bfa4d9bc296c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:eb:ea', 'vm-uuid': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.5454] manager: (tapa6be1de1-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.548 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.553 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.554 186853 INFO os_vif [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2')#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.627 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.628 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.628 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No VIF found with MAC fa:16:3e:e8:eb:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.629 186853 INFO nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Using config drive#033[00m
Nov 22 03:19:09 np0005531887 podman[237221]: 2025-11-22 08:19:09.688895295 +0000 UTC m=+0.094442226 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:19:09 np0005531887 kernel: tapa6be1de1-c2: entered promiscuous mode
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.7097] manager: (tapa6be1de1-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Nov 22 03:19:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:09Z|00514|binding|INFO|Claiming lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for this chassis.
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.710 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:09Z|00515|binding|INFO|a6be1de1-c2dd-4be7-89df-bfa4d9bc296c: Claiming fa:16:3e:e8:eb:ea 10.100.0.3
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.7260] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.7266] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.725 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.731 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:eb:ea 10.100.0.3'], port_security=['fa:16:3e:e8:eb:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec72ffac-7400-49d0-9e0a-60c991449755', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '84df0425-e3b1-4ba9-b876-812d98417396', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=271b2087-b100-40c2-aba4-df256e37c26c, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.732 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c in datapath ec72ffac-7400-49d0-9e0a-60c991449755 bound to our chassis#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.734 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec72ffac-7400-49d0-9e0a-60c991449755#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.745 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e09c9aa9-de74-4c42-9716-12e621e9d94c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.746 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec72ffac-71 in ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.748 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec72ffac-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.748 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3950f028-0390-4e99-9818-67b229d47412]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.749 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[87be3e53-2639-4694-bf3f-2f7d48c77384]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 systemd-udevd[237261]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:19:09 np0005531887 systemd-machined[153180]: New machine qemu-55-instance-00000096.
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.762 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[eae37da4-9f3a-44cc-aa52-3e55604a5d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.7656] device (tapa6be1de1-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.7666] device (tapa6be1de1-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.793 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8f557d8a-9586-4d46-b38a-d25f487f661f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 systemd[1]: Started Virtual Machine qemu-55-instance-00000096.
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.833 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.831 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[843e1006-4749-4c16-b054-cba6efaace8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.839 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.8431] manager: (tapec72ffac-70): new Veth device (/org/freedesktop/NetworkManager/Devices/244)
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.842 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[28487315-efe4-4dd0-ba9a-2cb20a870955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.844 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:09Z|00516|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c ovn-installed in OVS
Nov 22 03:19:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:09Z|00517|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c up in Southbound
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 nova_compute[186849]: 2025-11-22 08:19:09.873 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.889 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc19633-ed1b-4dc3-8c51-5104c263027d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.893 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9467a680-c168-4886-92b2-301a985fc997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 NetworkManager[55210]: <info>  [1763799549.9221] device (tapec72ffac-70): carrier: link connected
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.929 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6d9966-92d1-4323-b8e2-527a07ae403e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.949 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dbce86fd-bfae-480c-815d-bf2debf6f941]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec72ffac-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629252, 'reachable_time': 19064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237294, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.964 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e944cb12-4a58-4626-a7f3-6cf9ba337440]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:5f6b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 629252, 'tstamp': 629252}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237295, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:09.981 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd960cb-fc61-479b-b680-441e5ca29b24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec72ffac-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:5f:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629252, 'reachable_time': 19064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237296, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.016 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b875fa37-6e57-4db0-8473-443f5eabc7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.082 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8726713e-36e5-4574-9268-237c1e80b78f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.084 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec72ffac-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.084 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.084 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec72ffac-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:10 np0005531887 NetworkManager[55210]: <info>  [1763799550.0868] manager: (tapec72ffac-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Nov 22 03:19:10 np0005531887 kernel: tapec72ffac-70: entered promiscuous mode
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.088 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.090 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec72ffac-70, col_values=(('external_ids', {'iface-id': 'a04e532b-8aea-4e90-9617-f7d5299315eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:10 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:10Z|00518|binding|INFO|Releasing lport a04e532b-8aea-4e90-9617-f7d5299315eb from this chassis (sb_readonly=0)
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.092 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.093 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[98201246-135c-4887-b10a-ad37c3cddf70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.094 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-ec72ffac-7400-49d0-9e0a-60c991449755
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/ec72ffac-7400-49d0-9e0a-60c991449755.pid.haproxy
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID ec72ffac-7400-49d0-9e0a-60c991449755
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:19:10 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:10.095 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'env', 'PROCESS_TAG=haproxy-ec72ffac-7400-49d0-9e0a-60c991449755', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec72ffac-7400-49d0-9e0a-60c991449755.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.103 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.367 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799550.36648, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.374 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.377 186853 DEBUG nova.compute.manager [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.382 186853 INFO nova.virt.libvirt.driver [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance running successfully.#033[00m
Nov 22 03:19:10 np0005531887 virtqemud[186424]: argument unsupported: QEMU guest agent is not configured
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.385 186853 DEBUG nova.virt.libvirt.guest [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.386 186853 DEBUG nova.virt.libvirt.driver [None req-e9fdeae6-c413-48f7-963c-d61797883262 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.398 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.402 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.420 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.421 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799550.3678684, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.421 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Started (Lifecycle Event)#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.451 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.455 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.489 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 22 03:19:10 np0005531887 podman[237333]: 2025-11-22 08:19:10.548313248 +0000 UTC m=+0.109517139 container create 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:19:10 np0005531887 podman[237333]: 2025-11-22 08:19:10.461070911 +0000 UTC m=+0.022274802 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:19:10 np0005531887 systemd[1]: Started libpod-conmon-2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207.scope.
Nov 22 03:19:10 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:19:10 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b970247e5aff03b9d6fce7ce8c229ab657d76bdfeb15470a3a219dbcfa4f0bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:19:10 np0005531887 podman[237333]: 2025-11-22 08:19:10.652894104 +0000 UTC m=+0.214098005 container init 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:19:10 np0005531887 podman[237333]: 2025-11-22 08:19:10.658421111 +0000 UTC m=+0.219624982 container start 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 03:19:10 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [NOTICE]   (237350) : New worker (237352) forked
Nov 22 03:19:10 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [NOTICE]   (237350) : Loading success.
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.820 186853 DEBUG nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.821 186853 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.822 186853 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.822 186853 DEBUG oslo_concurrency.lockutils [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.822 186853 DEBUG nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.822 186853 WARNING nova.compute.manager [req-d560908d-2b72-4a1e-8a0d-54d5a9cf90a5 req-e4e995e7-5e58-42a5-94c1-bd71d3c81f75 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.948 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.961 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.962 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.962 186853 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.962 186853 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.963 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.964 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.964 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.964 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:10 np0005531887 nova_compute[186849]: 2025-11-22 08:19:10.984 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.050 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.110 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.112 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.180 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.362 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.365 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5610MB free_disk=73.2451286315918GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.365 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.366 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.405 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Applying migration context for instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a as it has an incoming, in-progress migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f. Migration status is finished _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.406 186853 INFO nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating resource usage from migration 86738dcc-794c-48e8-bcb5-a8fc825b0b3f#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.476 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.477 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.478 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.536 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.549 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.597 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:19:11 np0005531887 nova_compute[186849]: 2025-11-22 08:19:11.598 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.180 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.941 186853 DEBUG nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.942 186853 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.942 186853 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.943 186853 DEBUG oslo_concurrency.lockutils [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.943 186853 DEBUG nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:12 np0005531887 nova_compute[186849]: 2025-11-22 08:19:12.944 186853 WARNING nova.compute.manager [req-73ebb9d8-82ec-4ce4-a540-ee008eb0a2d9 req-c10e305b-1c74-4330-87fd-861e6ce92d0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state resized and task_state None.#033[00m
Nov 22 03:19:14 np0005531887 nova_compute[186849]: 2025-11-22 08:19:14.404 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:14 np0005531887 nova_compute[186849]: 2025-11-22 08:19:14.405 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:14 np0005531887 nova_compute[186849]: 2025-11-22 08:19:14.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:14 np0005531887 podman[237369]: 2025-11-22 08:19:14.8397162 +0000 UTC m=+0.062119427 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:19:15 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 03:19:15 np0005531887 systemd[237155]: Activating special unit Exit the Session...
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped target Main User Target.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped target Basic System.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped target Paths.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped target Sockets.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped target Timers.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 03:19:15 np0005531887 systemd[237155]: Closed D-Bus User Message Bus Socket.
Nov 22 03:19:15 np0005531887 systemd[237155]: Stopped Create User's Volatile Files and Directories.
Nov 22 03:19:15 np0005531887 systemd[237155]: Removed slice User Application Slice.
Nov 22 03:19:15 np0005531887 systemd[237155]: Reached target Shutdown.
Nov 22 03:19:15 np0005531887 systemd[237155]: Finished Exit the Session.
Nov 22 03:19:15 np0005531887 systemd[237155]: Reached target Exit the Session.
Nov 22 03:19:15 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 03:19:15 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 03:19:15 np0005531887 nova_compute[186849]: 2025-11-22 08:19:15.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:15 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 03:19:15 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 03:19:15 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 03:19:15 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 03:19:15 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 03:19:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:16.050 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8:0:1:f816:3eff:fe2c:5b22 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2c:5b22/64 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:16.051 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated#033[00m
Nov 22 03:19:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:16.053 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:16.055 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[16859778-aead-4cb6-8b75-b08c143d038e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:16 np0005531887 nova_compute[186849]: 2025-11-22 08:19:16.086 186853 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated VIF entry in instance network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:19:16 np0005531887 nova_compute[186849]: 2025-11-22 08:19:16.087 186853 DEBUG nova.network.neutron [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:16 np0005531887 nova_compute[186849]: 2025-11-22 08:19:16.160 186853 DEBUG oslo_concurrency.lockutils [req-e97bb9ba-d9db-49fe-9ca4-6add86beba63 req-8bab740d-4a00-412f-a173-29616eed57b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:17 np0005531887 nova_compute[186849]: 2025-11-22 08:19:17.182 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:18 np0005531887 nova_compute[186849]: 2025-11-22 08:19:18.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:18 np0005531887 nova_compute[186849]: 2025-11-22 08:19:18.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:19:18 np0005531887 podman[237392]: 2025-11-22 08:19:18.852835081 +0000 UTC m=+0.071343386 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 03:19:18 np0005531887 podman[237393]: 2025-11-22 08:19:18.883759605 +0000 UTC m=+0.101401808 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:19:19 np0005531887 nova_compute[186849]: 2025-11-22 08:19:19.548 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:19 np0005531887 nova_compute[186849]: 2025-11-22 08:19:19.787 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:22 np0005531887 nova_compute[186849]: 2025-11-22 08:19:22.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:23 np0005531887 nova_compute[186849]: 2025-11-22 08:19:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:23 np0005531887 nova_compute[186849]: 2025-11-22 08:19:23.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:19:23 np0005531887 nova_compute[186849]: 2025-11-22 08:19:23.797 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:19:24 np0005531887 nova_compute[186849]: 2025-11-22 08:19:24.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:24 np0005531887 podman[237448]: 2025-11-22 08:19:24.648410889 +0000 UTC m=+0.055640086 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:19:26 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:26Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:eb:ea 10.100.0.3
Nov 22 03:19:27 np0005531887 nova_compute[186849]: 2025-11-22 08:19:27.187 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:29 np0005531887 nova_compute[186849]: 2025-11-22 08:19:29.555 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:29 np0005531887 podman[237471]: 2025-11-22 08:19:29.841138661 +0000 UTC m=+0.058334953 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:19:32 np0005531887 nova_compute[186849]: 2025-11-22 08:19:32.190 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:33 np0005531887 nova_compute[186849]: 2025-11-22 08:19:33.579 186853 INFO nova.compute.manager [None req-1b178c81-f68d-445d-8dd1-1bcaf4d4112c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Get console output#033[00m
Nov 22 03:19:33 np0005531887 nova_compute[186849]: 2025-11-22 08:19:33.670 213402 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:19:34 np0005531887 nova_compute[186849]: 2025-11-22 08:19:34.558 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:34 np0005531887 podman[237492]: 2025-11-22 08:19:34.845328409 +0000 UTC m=+0.064383923 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.664 186853 DEBUG nova.compute.manager [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.664 186853 DEBUG nova.compute.manager [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing instance network info cache due to event network-changed-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.664 186853 DEBUG oslo_concurrency.lockutils [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.665 186853 DEBUG oslo_concurrency.lockutils [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.665 186853 DEBUG nova.network.neutron [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Refreshing network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.738 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.738 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.738 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.739 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.739 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.745 186853 INFO nova.compute.manager [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Terminating instance#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.750 186853 DEBUG nova.compute.manager [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:19:35 np0005531887 kernel: tapa6be1de1-c2 (unregistering): left promiscuous mode
Nov 22 03:19:35 np0005531887 NetworkManager[55210]: <info>  [1763799575.7897] device (tapa6be1de1-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.800 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:35Z|00519|binding|INFO|Releasing lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c from this chassis (sb_readonly=0)
Nov 22 03:19:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:35Z|00520|binding|INFO|Setting lport a6be1de1-c2dd-4be7-89df-bfa4d9bc296c down in Southbound
Nov 22 03:19:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:19:35Z|00521|binding|INFO|Removing iface tapa6be1de1-c2 ovn-installed in OVS
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.803 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:35.817 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:eb:ea 10.100.0.3'], port_security=['fa:16:3e:e8:eb:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ff7656a5-6680-4acd-a89d-fdc5e9fb914a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec72ffac-7400-49d0-9e0a-60c991449755', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '8', 'neutron:security_group_ids': '84df0425-e3b1-4ba9-b876-812d98417396', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=271b2087-b100-40c2-aba4-df256e37c26c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:19:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:35.818 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c in datapath ec72ffac-7400-49d0-9e0a-60c991449755 unbound from our chassis#033[00m
Nov 22 03:19:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:35.819 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec72ffac-7400-49d0-9e0a-60c991449755, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:19:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:35.820 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7b84782e-4970-4948-ac58-9aa304f66d9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:35.821 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 namespace which is not needed anymore#033[00m
Nov 22 03:19:35 np0005531887 nova_compute[186849]: 2025-11-22 08:19:35.823 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:35 np0005531887 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000096.scope: Deactivated successfully.
Nov 22 03:19:35 np0005531887 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000096.scope: Consumed 15.713s CPU time.
Nov 22 03:19:35 np0005531887 systemd-machined[153180]: Machine qemu-55-instance-00000096 terminated.
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.020 186853 INFO nova.virt.libvirt.driver [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Instance destroyed successfully.#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.022 186853 DEBUG nova.objects.instance [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid ff7656a5-6680-4acd-a89d-fdc5e9fb914a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.041 186853 DEBUG nova.virt.libvirt.vif [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:18:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-104914358',display_name='tempest-TestNetworkAdvancedServerOps-server-104914358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-104914358',id=150,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGuYeAEoXXbFDPuWNPKdh/K1JH4L9ZCXU/SY8Quy5TL9WW/Qq6H4zQToZJbmU7x96LpJWQ/NfkaUrq1jAo7d4tTwPh3rAycu6tk9EuY65V+7L7m3g1sqWP9C3rGfSGoErQ==',key_name='tempest-TestNetworkAdvancedServerOps-1623117955',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:19:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-mnvd2q8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:19:20Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=ff7656a5-6680-4acd-a89d-fdc5e9fb914a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.041 186853 DEBUG nova.network.os_vif_util [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.043 186853 DEBUG nova.network.os_vif_util [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.043 186853 DEBUG os_vif [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.046 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.047 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6be1de1-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.049 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.054 186853 INFO os_vif [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:eb:ea,bridge_name='br-int',has_traffic_filtering=True,id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c,network=Network(ec72ffac-7400-49d0-9e0a-60c991449755),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6be1de1-c2')#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.055 186853 INFO nova.virt.libvirt.driver [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Deleting instance files /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_del#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.067 186853 INFO nova.virt.libvirt.driver [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Deletion of /var/lib/nova/instances/ff7656a5-6680-4acd-a89d-fdc5e9fb914a_del complete#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.169 186853 INFO nova.compute.manager [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.170 186853 DEBUG oslo.service.loopingcall [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.171 186853 DEBUG nova.compute.manager [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.171 186853 DEBUG nova.network.neutron [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:19:36 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [NOTICE]   (237350) : haproxy version is 2.8.14-c23fe91
Nov 22 03:19:36 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [NOTICE]   (237350) : path to executable is /usr/sbin/haproxy
Nov 22 03:19:36 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [WARNING]  (237350) : Exiting Master process...
Nov 22 03:19:36 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [ALERT]    (237350) : Current worker (237352) exited with code 143 (Terminated)
Nov 22 03:19:36 np0005531887 neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755[237346]: [WARNING]  (237350) : All workers exited. Exiting... (0)
Nov 22 03:19:36 np0005531887 systemd[1]: libpod-2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207.scope: Deactivated successfully.
Nov 22 03:19:36 np0005531887 podman[237537]: 2025-11-22 08:19:36.212397677 +0000 UTC m=+0.299434236 container died 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:19:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay-4b970247e5aff03b9d6fce7ce8c229ab657d76bdfeb15470a3a219dbcfa4f0bf-merged.mount: Deactivated successfully.
Nov 22 03:19:36 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207-userdata-shm.mount: Deactivated successfully.
Nov 22 03:19:36 np0005531887 podman[237537]: 2025-11-22 08:19:36.637517899 +0000 UTC m=+0.724554448 container cleanup 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:19:36 np0005531887 systemd[1]: libpod-conmon-2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207.scope: Deactivated successfully.
Nov 22 03:19:36 np0005531887 podman[237584]: 2025-11-22 08:19:36.748517374 +0000 UTC m=+0.089097534 container remove 2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.756 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c945483b-bf92-4b4a-9221-58bb18925aa0]: (4, ('Sat Nov 22 08:19:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 (2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207)\n2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207\nSat Nov 22 08:19:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 (2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207)\n2828ece5581c7966514bfc80f09f1202c88796a9b4dc6ec92600e2396e6c8207\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.758 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[affaf414-17fe-4d1b-9fbe-9c9045804090]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.759 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec72ffac-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:36 np0005531887 kernel: tapec72ffac-70: left promiscuous mode
Nov 22 03:19:36 np0005531887 nova_compute[186849]: 2025-11-22 08:19:36.772 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.777 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f64a9d-55cd-46ed-986a-c5a0e95e171f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.793 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a5bc5423-1f2e-4486-933a-a6937f665376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.795 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[71cbc8a4-ae36-4b05-b2c4-6c878940f6e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.814 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[131c3325-bd29-44aa-842b-0519ddcb0df8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629243, 'reachable_time': 17174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237599, 'error': None, 'target': 'ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.817 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec72ffac-7400-49d0-9e0a-60c991449755 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:19:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:36.817 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[cac21160-c0a6-4235-9183-74c376b8964c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:19:36 np0005531887 systemd[1]: run-netns-ovnmeta\x2dec72ffac\x2d7400\x2d49d0\x2d9e0a\x2d60c991449755.mount: Deactivated successfully.
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.192 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:37.353 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:37.353 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:19:37.354 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.779 186853 DEBUG nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.780 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.780 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.780 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.781 186853 DEBUG nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.781 186853 DEBUG nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-unplugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.781 186853 DEBUG nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.781 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.782 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.783 186853 DEBUG oslo_concurrency.lockutils [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.783 186853 DEBUG nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] No waiting events found dispatching network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:19:37 np0005531887 nova_compute[186849]: 2025-11-22 08:19:37.784 186853 WARNING nova.compute.manager [req-bc8e2e5a-3bc2-4457-95fc-14b4e8922715 req-0f9b6430-a0e5-4789-b8b4-56fd28c9968d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received unexpected event network-vif-plugged-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.025 186853 DEBUG nova.network.neutron [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.032 186853 DEBUG nova.network.neutron [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updated VIF entry in instance network info cache for port a6be1de1-c2dd-4be7-89df-bfa4d9bc296c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.033 186853 DEBUG nova.network.neutron [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [{"id": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "address": "fa:16:3e:e8:eb:ea", "network": {"id": "ec72ffac-7400-49d0-9e0a-60c991449755", "bridge": "br-int", "label": "tempest-network-smoke--1838609208", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6be1de1-c2", "ovs_interfaceid": "a6be1de1-c2dd-4be7-89df-bfa4d9bc296c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.048 186853 INFO nova.compute.manager [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Took 2.88 seconds to deallocate network for instance.#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.048 186853 DEBUG oslo_concurrency.lockutils [req-4c561d22-29d6-4481-b63c-cbeb69d2ceba req-e49ebb26-7bdf-4c93-9025-2ca6e96211cb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ff7656a5-6680-4acd-a89d-fdc5e9fb914a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.091 186853 DEBUG nova.compute.manager [req-a22063da-2eba-44ec-a466-12fa397bd5fd req-c2577a82-22d7-4697-85f2-706699165a9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Received event network-vif-deleted-a6be1de1-c2dd-4be7-89df-bfa4d9bc296c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.092 186853 INFO nova.compute.manager [req-a22063da-2eba-44ec-a466-12fa397bd5fd req-c2577a82-22d7-4697-85f2-706699165a9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Neutron deleted interface a6be1de1-c2dd-4be7-89df-bfa4d9bc296c; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.092 186853 DEBUG nova.network.neutron [req-a22063da-2eba-44ec-a466-12fa397bd5fd req-c2577a82-22d7-4697-85f2-706699165a9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.121 186853 DEBUG nova.compute.manager [req-a22063da-2eba-44ec-a466-12fa397bd5fd req-c2577a82-22d7-4697-85f2-706699165a9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Detach interface failed, port_id=a6be1de1-c2dd-4be7-89df-bfa4d9bc296c, reason: Instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.127 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.127 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.203 186853 DEBUG nova.compute.provider_tree [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.220 186853 DEBUG nova.scheduler.client.report [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.238 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.258 186853 INFO nova.scheduler.client.report [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance ff7656a5-6680-4acd-a89d-fdc5e9fb914a#033[00m
Nov 22 03:19:39 np0005531887 nova_compute[186849]: 2025-11-22 08:19:39.321 186853 DEBUG oslo_concurrency.lockutils [None req-6cb95399-3a13-4afb-abf0-e9c1983431a5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "ff7656a5-6680-4acd-a89d-fdc5e9fb914a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:19:39 np0005531887 podman[237600]: 2025-11-22 08:19:39.833107441 +0000 UTC m=+0.052618802 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:19:41 np0005531887 nova_compute[186849]: 2025-11-22 08:19:41.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:42 np0005531887 nova_compute[186849]: 2025-11-22 08:19:42.194 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:42 np0005531887 nova_compute[186849]: 2025-11-22 08:19:42.207 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:42 np0005531887 nova_compute[186849]: 2025-11-22 08:19:42.322 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:45 np0005531887 podman[237626]: 2025-11-22 08:19:45.851151382 +0000 UTC m=+0.061680376 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Nov 22 03:19:46 np0005531887 nova_compute[186849]: 2025-11-22 08:19:46.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:47 np0005531887 nova_compute[186849]: 2025-11-22 08:19:47.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:49 np0005531887 podman[237647]: 2025-11-22 08:19:49.850955861 +0000 UTC m=+0.067472289 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:19:49 np0005531887 podman[237648]: 2025-11-22 08:19:49.87068375 +0000 UTC m=+0.087281970 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 03:19:51 np0005531887 nova_compute[186849]: 2025-11-22 08:19:51.019 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799576.01799, ff7656a5-6680-4acd-a89d-fdc5e9fb914a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:19:51 np0005531887 nova_compute[186849]: 2025-11-22 08:19:51.019 186853 INFO nova.compute.manager [-] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:19:51 np0005531887 nova_compute[186849]: 2025-11-22 08:19:51.051 186853 DEBUG nova.compute.manager [None req-4b025c11-83b9-452b-a520-e9815cf0b2c2 - - - - - -] [instance: ff7656a5-6680-4acd-a89d-fdc5e9fb914a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:19:51 np0005531887 nova_compute[186849]: 2025-11-22 08:19:51.053 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:52 np0005531887 nova_compute[186849]: 2025-11-22 08:19:52.198 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:54 np0005531887 podman[237693]: 2025-11-22 08:19:54.823660802 +0000 UTC m=+0.048698925 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:19:56 np0005531887 nova_compute[186849]: 2025-11-22 08:19:56.055 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:19:57 np0005531887 nova_compute[186849]: 2025-11-22 08:19:57.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:00 np0005531887 podman[237717]: 2025-11-22 08:20:00.837489858 +0000 UTC m=+0.055287187 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:20:01 np0005531887 nova_compute[186849]: 2025-11-22 08:20:01.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:01 np0005531887 nova_compute[186849]: 2025-11-22 08:20:01.796 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:02 np0005531887 nova_compute[186849]: 2025-11-22 08:20:02.201 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:04.031 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:04 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:04.032 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.867 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.867 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.894 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.987 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.987 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.995 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:20:04 np0005531887 nova_compute[186849]: 2025-11-22 08:20:04.996 186853 INFO nova.compute.claims [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.153 186853 DEBUG nova.compute.provider_tree [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.172 186853 DEBUG nova.scheduler.client.report [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.198 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.199 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.259 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.260 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.283 186853 INFO nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.302 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.409 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.410 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.411 186853 INFO nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Creating image(s)#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.411 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.412 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.412 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.426 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.480 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.481 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.481 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.493 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.547 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.549 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.641 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk 1073741824" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.642 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.642 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.661 186853 DEBUG nova.policy [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.700 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.701 186853 DEBUG nova.virt.disk.api [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.701 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.769 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.770 186853 DEBUG nova.virt.disk.api [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.771 186853 DEBUG nova.objects.instance [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 6b66084f-0e71-48ba-897d-2a2519ece774 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.782 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.782 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Ensure instance console log exists: /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.783 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.783 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:05 np0005531887 nova_compute[186849]: 2025-11-22 08:20:05.784 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:05 np0005531887 podman[237749]: 2025-11-22 08:20:05.840099238 +0000 UTC m=+0.062290011 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:20:06 np0005531887 nova_compute[186849]: 2025-11-22 08:20:06.059 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:07 np0005531887 nova_compute[186849]: 2025-11-22 08:20:07.203 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:07 np0005531887 nova_compute[186849]: 2025-11-22 08:20:07.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.147 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Successfully created port: 527c4007-077d-42a6-9f7c-79c07ded5ed7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.991 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.992 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5723MB free_disk=73.2735595703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.992 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:08 np0005531887 nova_compute[186849]: 2025-11-22 08:20:08.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.399 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 6b66084f-0e71-48ba-897d-2a2519ece774 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.399 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.400 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.587 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.610 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.634 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:20:09 np0005531887 nova_compute[186849]: 2025-11-22 08:20:09.634 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.304 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Successfully updated port: 527c4007-077d-42a6-9f7c-79c07ded5ed7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.321 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.321 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.321 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.420 186853 DEBUG nova.compute.manager [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.420 186853 DEBUG nova.compute.manager [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing instance network info cache due to event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.420 186853 DEBUG oslo_concurrency.lockutils [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.530 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.634 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.635 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.635 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.652 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.653 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.654 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.654 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:20:10 np0005531887 nova_compute[186849]: 2025-11-22 08:20:10.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:10 np0005531887 podman[237772]: 2025-11-22 08:20:10.834569916 +0000 UTC m=+0.054456398 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:20:11 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:11.033 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:11 np0005531887 nova_compute[186849]: 2025-11-22 08:20:11.061 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:12 np0005531887 nova_compute[186849]: 2025-11-22 08:20:12.204 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.159 186853 DEBUG nova.network.neutron [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updating instance_info_cache with network_info: [{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.185 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.185 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Instance network_info: |[{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.186 186853 DEBUG oslo_concurrency.lockutils [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.186 186853 DEBUG nova.network.neutron [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.190 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Start _get_guest_xml network_info=[{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.195 186853 WARNING nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.200 186853 DEBUG nova.virt.libvirt.host [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.201 186853 DEBUG nova.virt.libvirt.host [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.209 186853 DEBUG nova.virt.libvirt.host [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.210 186853 DEBUG nova.virt.libvirt.host [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.211 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.211 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.212 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.212 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.212 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.212 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.213 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.213 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.213 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.213 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.213 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.214 186853 DEBUG nova.virt.hardware [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.217 186853 DEBUG nova.virt.libvirt.vif [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:20:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523037456',display_name='tempest-TestGettingAddress-server-1523037456',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523037456',id=153,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-5t8n4cws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:20:05Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=6b66084f-0e71-48ba-897d-2a2519ece774,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", 
"dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.218 186853 DEBUG nova.network.os_vif_util [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.218 186853 DEBUG nova.network.os_vif_util [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.219 186853 DEBUG nova.objects.instance [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b66084f-0e71-48ba-897d-2a2519ece774 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.247 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <uuid>6b66084f-0e71-48ba-897d-2a2519ece774</uuid>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <name>instance-00000099</name>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestGettingAddress-server-1523037456</nova:name>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:20:14</nova:creationTime>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        <nova:port uuid="527c4007-077d-42a6-9f7c-79c07ded5ed7">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe83:362c" ipVersion="6"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe83:362c" ipVersion="6"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="serial">6b66084f-0e71-48ba-897d-2a2519ece774</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="uuid">6b66084f-0e71-48ba-897d-2a2519ece774</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.config"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:83:36:2c"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <target dev="tap527c4007-07"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/console.log" append="off"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:20:14 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:20:14 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:20:14 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:20:14 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.249 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Preparing to wait for external event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.249 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.249 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.249 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.250 186853 DEBUG nova.virt.libvirt.vif [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:20:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523037456',display_name='tempest-TestGettingAddress-server-1523037456',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523037456',id=153,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-5t8n4cws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:20:05Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=6b66084f-0e71-48ba-897d-2a2519ece774,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.250 186853 DEBUG nova.network.os_vif_util [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.251 186853 DEBUG nova.network.os_vif_util [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.251 186853 DEBUG os_vif [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.252 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.252 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.255 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.255 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap527c4007-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.255 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap527c4007-07, col_values=(('external_ids', {'iface-id': '527c4007-077d-42a6-9f7c-79c07ded5ed7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:36:2c', 'vm-uuid': '6b66084f-0e71-48ba-897d-2a2519ece774'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.257 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531887 NetworkManager[55210]: <info>  [1763799614.2594] manager: (tap527c4007-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.260 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.271 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.272 186853 INFO os_vif [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07')#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.324 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.325 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.325 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:83:36:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:20:14 np0005531887 nova_compute[186849]: 2025-11-22 08:20:14.325 186853 INFO nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Using config drive#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.062 186853 INFO nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Creating config drive at /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.config#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.072 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi6ltr38r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.202 186853 DEBUG oslo_concurrency.processutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi6ltr38r" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:20:15 np0005531887 kernel: tap527c4007-07: entered promiscuous mode
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.2837] manager: (tap527c4007-07): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.285 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:15Z|00522|binding|INFO|Claiming lport 527c4007-077d-42a6-9f7c-79c07ded5ed7 for this chassis.
Nov 22 03:20:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:15Z|00523|binding|INFO|527c4007-077d-42a6-9f7c-79c07ded5ed7: Claiming fa:16:3e:83:36:2c 10.100.0.8 2001:db8:0:1:f816:3eff:fe83:362c 2001:db8::f816:3eff:fe83:362c
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.289 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.295 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.302 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.3050] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.3059] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.309 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:36:2c 10.100.0.8 2001:db8:0:1:f816:3eff:fe83:362c 2001:db8::f816:3eff:fe83:362c'], port_security=['fa:16:3e:83:36:2c 10.100.0.8 2001:db8:0:1:f816:3eff:fe83:362c 2001:db8::f816:3eff:fe83:362c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28 2001:db8:0:1:f816:3eff:fe83:362c/64 2001:db8::f816:3eff:fe83:362c/64', 'neutron:device_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '04c6695d-d046-49fb-a069-528067303a16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=527c4007-077d-42a6-9f7c-79c07ded5ed7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.310 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 527c4007-077d-42a6-9f7c-79c07ded5ed7 in datapath cfb1249f-37ac-4df7-b559-e7968406997d bound to our chassis#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.312 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfb1249f-37ac-4df7-b559-e7968406997d#033[00m
Nov 22 03:20:15 np0005531887 systemd-udevd[237815]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.325 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd4961b-0b57-4dfd-9d7a-9e06bb4811de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.326 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcfb1249f-31 in ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.328 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcfb1249f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.328 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4027fac5-f8bd-4db8-a98a-c6edf83cc723]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.329 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2f84b127-0685-4bd7-8046-2529abb321c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.3326] device (tap527c4007-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.3349] device (tap527c4007-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:20:15 np0005531887 systemd-machined[153180]: New machine qemu-56-instance-00000099.
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.341 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[4b82979c-1c14-4ac5-9dba-699f0ee171c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.369 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca35436-fd09-438c-9ef3-967b9c27765b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.405 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[befc5f36-7fbd-48da-9d76-fe05db0f485f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 systemd[1]: Started Virtual Machine qemu-56-instance-00000099.
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.410 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfb0b97-0f16-4550-9a9b-c5c8bfeeaa12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.4114] manager: (tapcfb1249f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:15Z|00524|binding|INFO|Setting lport 527c4007-077d-42a6-9f7c-79c07ded5ed7 ovn-installed in OVS
Nov 22 03:20:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:15Z|00525|binding|INFO|Setting lport 527c4007-077d-42a6-9f7c-79c07ded5ed7 up in Southbound
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.426 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.442 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[4e10b2e8-1ae6-4690-ae06-23cd02938779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.446 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[542f2a49-5046-42c9-87f9-23f814fa92fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.4728] device (tapcfb1249f-30): carrier: link connected
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.478 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[152b1d52-8fb7-4f44-94b1-3600fe229be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.494 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf6fcb7-12a3-464a-ad3b-bf9253990940]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfb1249f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:5b:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635808, 'reachable_time': 34173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237848, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.515 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8ee232-6ce2-4a9f-b598-c7f062f63f18]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:5b22'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635808, 'tstamp': 635808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237850, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.543 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e755ca-27ba-4c9a-9f94-1a7556617654]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfb1249f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:5b:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635808, 'reachable_time': 34173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237851, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.585 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f818638c-ad6a-4eb5-8bb1-d77b10bdf062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.661 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1823a533-2194-4d0b-861d-d79d1f2f184a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.662 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfb1249f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.662 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.662 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfb1249f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 NetworkManager[55210]: <info>  [1763799615.6647] manager: (tapcfb1249f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Nov 22 03:20:15 np0005531887 kernel: tapcfb1249f-30: entered promiscuous mode
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.666 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.668 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfb1249f-30, col_values=(('external_ids', {'iface-id': 'ac76a812-5ead-4b51-8c63-4eaca1b65820'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.669 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:15Z|00526|binding|INFO|Releasing lport ac76a812-5ead-4b51-8c63-4eaca1b65820 from this chassis (sb_readonly=0)
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.670 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.671 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.672 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[27fa6ac0-2ca9-4875-94dc-cd04e3f6fa7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.673 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-cfb1249f-37ac-4df7-b559-e7968406997d
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/cfb1249f-37ac-4df7-b559-e7968406997d.pid.haproxy
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID cfb1249f-37ac-4df7-b559-e7968406997d
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:20:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:15.675 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'env', 'PROCESS_TAG=haproxy-cfb1249f-37ac-4df7-b559-e7968406997d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cfb1249f-37ac-4df7-b559-e7968406997d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.680 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.765 186853 DEBUG nova.compute.manager [req-5107df7e-4853-47a2-95b5-9997fce0773b req-424e27de-64a1-49c0-9b5d-390a911cbdd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.766 186853 DEBUG oslo_concurrency.lockutils [req-5107df7e-4853-47a2-95b5-9997fce0773b req-424e27de-64a1-49c0-9b5d-390a911cbdd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.766 186853 DEBUG oslo_concurrency.lockutils [req-5107df7e-4853-47a2-95b5-9997fce0773b req-424e27de-64a1-49c0-9b5d-390a911cbdd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.766 186853 DEBUG oslo_concurrency.lockutils [req-5107df7e-4853-47a2-95b5-9997fce0773b req-424e27de-64a1-49c0-9b5d-390a911cbdd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.767 186853 DEBUG nova.compute.manager [req-5107df7e-4853-47a2-95b5-9997fce0773b req-424e27de-64a1-49c0-9b5d-390a911cbdd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Processing event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.767 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799615.76611, 6b66084f-0e71-48ba-897d-2a2519ece774 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.768 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] VM Started (Lifecycle Event)#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.770 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.773 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.777 186853 INFO nova.virt.libvirt.driver [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Instance spawned successfully.#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.777 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.866 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.875 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.893 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.894 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.894 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.895 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.895 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.896 186853 DEBUG nova.virt.libvirt.driver [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.904 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.904 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799615.7665505, 6b66084f-0e71-48ba-897d-2a2519ece774 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.904 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.986 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.990 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799615.7727695, 6b66084f-0e71-48ba-897d-2a2519ece774 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:15 np0005531887 nova_compute[186849]: 2025-11-22 08:20:15.991 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.010 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.013 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.033 186853 INFO nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Took 10.62 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.033 186853 DEBUG nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.039 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:20:16 np0005531887 podman[237890]: 2025-11-22 08:20:16.07190818 +0000 UTC m=+0.059338879 container create a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:20:16 np0005531887 systemd[1]: Started libpod-conmon-a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394.scope.
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.119 186853 INFO nova.compute.manager [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Took 11.17 seconds to build instance.#033[00m
Nov 22 03:20:16 np0005531887 podman[237890]: 2025-11-22 08:20:16.035868098 +0000 UTC m=+0.023298817 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:20:16 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:20:16 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9c3cdcc9478ca4c22b245800c15df0c897161f0f1ac649976b87a45523c9323/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.147 186853 DEBUG oslo_concurrency.lockutils [None req-40e881cf-c707-4df9-8e0c-29f2fdc712d8 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:16 np0005531887 podman[237890]: 2025-11-22 08:20:16.157869315 +0000 UTC m=+0.145300044 container init a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:20:16 np0005531887 podman[237890]: 2025-11-22 08:20:16.165049083 +0000 UTC m=+0.152479782 container start a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:20:16 np0005531887 podman[237903]: 2025-11-22 08:20:16.182637028 +0000 UTC m=+0.068679429 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 22 03:20:16 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [NOTICE]   (237925) : New worker (237930) forked
Nov 22 03:20:16 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [NOTICE]   (237925) : Loading success.
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.675 186853 DEBUG nova.network.neutron [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updated VIF entry in instance network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.676 186853 DEBUG nova.network.neutron [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updating instance_info_cache with network_info: [{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.693 186853 DEBUG oslo_concurrency.lockutils [req-9ff0a39f-3273-4856-a269-d165d157ccf6 req-4ee46c70-8d6f-43f1-9da6-44d2936ebde1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:16 np0005531887 nova_compute[186849]: 2025-11-22 08:20:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:17 np0005531887 nova_compute[186849]: 2025-11-22 08:20:17.206 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.039 186853 DEBUG nova.compute.manager [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.040 186853 DEBUG oslo_concurrency.lockutils [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.040 186853 DEBUG oslo_concurrency.lockutils [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.040 186853 DEBUG oslo_concurrency.lockutils [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.041 186853 DEBUG nova.compute.manager [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] No waiting events found dispatching network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:18 np0005531887 nova_compute[186849]: 2025-11-22 08:20:18.041 186853 WARNING nova.compute.manager [req-e86ef384-8787-4d0d-a02b-10b1d6c8cdc8 req-787409c6-066d-4e2b-915d-5a6b24747aba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received unexpected event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:20:19 np0005531887 nova_compute[186849]: 2025-11-22 08:20:19.259 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:20 np0005531887 podman[237940]: 2025-11-22 08:20:20.852480359 +0000 UTC m=+0.070006622 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:20:20 np0005531887 podman[237941]: 2025-11-22 08:20:20.897289527 +0000 UTC m=+0.109485468 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:20:21 np0005531887 nova_compute[186849]: 2025-11-22 08:20:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.189 186853 DEBUG nova.compute.manager [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.189 186853 DEBUG nova.compute.manager [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing instance network info cache due to event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.190 186853 DEBUG oslo_concurrency.lockutils [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.190 186853 DEBUG oslo_concurrency.lockutils [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.190 186853 DEBUG nova.network.neutron [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:22 np0005531887 nova_compute[186849]: 2025-11-22 08:20:22.209 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:24 np0005531887 nova_compute[186849]: 2025-11-22 08:20:24.057 186853 DEBUG nova.network.neutron [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updated VIF entry in instance network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:24 np0005531887 nova_compute[186849]: 2025-11-22 08:20:24.058 186853 DEBUG nova.network.neutron [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updating instance_info_cache with network_info: [{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:24 np0005531887 nova_compute[186849]: 2025-11-22 08:20:24.103 186853 DEBUG oslo_concurrency.lockutils [req-7dd4e474-ab80-4600-979c-d34eef162836 req-5d5aff97-b9e3-4ded-ac23-1089d2df600f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:24 np0005531887 nova_compute[186849]: 2025-11-22 08:20:24.262 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:25 np0005531887 podman[237984]: 2025-11-22 08:20:25.855675904 +0000 UTC m=+0.071257793 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:20:27 np0005531887 nova_compute[186849]: 2025-11-22 08:20:27.210 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:29 np0005531887 nova_compute[186849]: 2025-11-22 08:20:29.267 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:30 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:30Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:36:2c 10.100.0.8
Nov 22 03:20:30 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:30Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:36:2c 10.100.0.8
Nov 22 03:20:31 np0005531887 nova_compute[186849]: 2025-11-22 08:20:31.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:20:31 np0005531887 podman[238027]: 2025-11-22 08:20:31.847435975 +0000 UTC m=+0.069620373 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:20:32 np0005531887 nova_compute[186849]: 2025-11-22 08:20:32.213 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:34 np0005531887 nova_compute[186849]: 2025-11-22 08:20:34.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.670 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'name': 'tempest-TestGettingAddress-server-1523037456', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000099', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.671 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.671 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>]
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.675 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6b66084f-0e71-48ba-897d-2a2519ece774 / tap527c4007-07 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.675 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.outgoing.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db1896d0-0077-454c-bf42-77d68a8eee54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.671862', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f7ebbc0-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '73d63c957993d252240caac52502163f2a4480fac505fe21f7eb1cc101fa04c6'}]}, 'timestamp': '2025-11-22 08:20:36.675892', '_unique_id': 'd1a4df276dd7437eb6cf5092f15c4e62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.678 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.678 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>]
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.679 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '027bd582-2ac1-4a93-a520-ef89686ddeae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.679108', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f7f5684-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': 'd0ec791e5f44a61b0b62bedc5fdf1f097063fc100b50958a2fcc527bbae52d37'}]}, 'timestamp': '2025-11-22 08:20:36.679845', '_unique_id': 'eaa254ada94345f8b806be055f10f0c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.710 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.710 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05cbbe7b-2ca0-411f-a454-01734961ed20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.681697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f840b7a-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': 'f14e0d9dab61445b89e4bc89e8d7b5329e957e0079cb4b3a54c3ef8bbbdec4aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.681697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f841908-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '65ee22161a33c155c9a36cabf2d76ab4942dbc8f32042b2adcdcd3d322bf9626'}]}, 'timestamp': '2025-11-22 08:20:36.710958', '_unique_id': 'c804708ce16a4b6195f533c9b6748dbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.712 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.latency volume: 3525384406 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ae92e2-5142-4eb7-800d-3a592148505c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3525384406, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.712776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f846a2a-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '7aa1767daacb2037f904be0fb1b754d48f428d3406cc1a399cca85677bdf144a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.712776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f8474b6-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '9a338a09e6c0b08a782b23b142975bdd21df208e85bd5effadba4d407807ef66'}]}, 'timestamp': '2025-11-22 08:20:36.713321', '_unique_id': 'c7371a8e975746eca5817d802ab2998e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.730 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.730 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96860521-b84d-4ae9-ad82-a1a4002622f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.714539', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f87210c-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': 'be434e964146089c95dc651198183ae2a3583deb667eb0abce94befb5701c86c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.714539', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f872c10-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': '72a52428a637767af101ea42fb5c449643aba8dbff3a8767a8f36d566054f9a9'}]}, 'timestamp': '2025-11-22 08:20:36.731098', '_unique_id': '3b03717b31bc4a00906dc86cb558e59b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.731 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.732 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.732 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eaeddef-8f01-4b85-bc76-42732e5d2d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.732689', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f8773b4-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': 'b04ed7caac4e2ab42cec617d5c01e99d64254a24361d7fa04e826c396205b5da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.732689', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f877cba-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '084dd6960033871641e759859e687a416cc7d0c3357a5cc42d42eba9190a803f'}]}, 'timestamp': '2025-11-22 08:20:36.733171', '_unique_id': 'b42bd7e9f30548bcaaa1bb2ef2b64855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.734 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.latency volume: 858876065 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.734 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.latency volume: 74071965 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e0324d1-7143-4609-87e1-9f678f55e159', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 858876065, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.734443', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f87b810-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '2acc95bbaf2ac656d2f7459271870e2e6a918320ad88a6a5d73081d7ae9c58c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 74071965, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': 
None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.734443', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f87c0c6-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '5bc56fddb8605d4c8caeeff40723562ee10f41efce6518711fdcd5eac03c385a'}]}, 'timestamp': '2025-11-22 08:20:36.734895', '_unique_id': '7fe1b7cbc3fd4650adfab4c66388324e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.736 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.736 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66add2da-4d23-4614-84c2-3355a74fcfc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.736165', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f87fcee-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': '09dfd3efaebfa609b300ac62f3919acb5d3d12afc3a17a3587e7a17430a46fdd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.736165', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f8805ae-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': '0879d13e537ccc48b99daeee50cebf37060d3b8345db449049bca544142aa518'}]}, 'timestamp': '2025-11-22 08:20:36.736666', '_unique_id': '5efc1eae0cf64828b78d1e93c75c3540'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.737 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bfc606d-cc09-4950-9cbe-166a58210b8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.737858', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f883da8-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': 'a14a53402bf92cdf139ffa077231e295701aa5339a1e887267b6103dc923afd8'}]}, 'timestamp': '2025-11-22 08:20:36.738117', '_unique_id': '4ca063231a9b47c59af8487390a3547a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.739 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.outgoing.bytes volume: 2236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e3a1018-6d93-48fb-99af-10c9350804dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2236, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.739471', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f887c8c-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '4e01dc728901f7d01b6650f7e02c6a84eddbc8d9d2be414adb45a1213dbfb6b1'}]}, 'timestamp': '2025-11-22 08:20:36.739718', '_unique_id': '3be40e2c5d4445dfaa7f3f5df3b91878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.740 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aab145ad-e3e5-449a-b4b0-8f9d8c0348b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.740904', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f88b45e-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '5ea0b0bb7b16fb68a6f914071251ddca078bd2a5b831ce0a84b60d08b3852dbe'}]}, 'timestamp': '2025-11-22 08:20:36.741146', '_unique_id': '18d3d6776aa0418d84835994c51a0265'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.742 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.742 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.bytes volume: 72912896 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.742 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6da81e30-5360-4a29-9959-94cd1e9b30ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72912896, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.742329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f88ebea-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '50c6aba74a40d4a59b239625e80c51537b9cd790fd8d3ee6473aa5661dfa66ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.742329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f88f4d2-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '2e2475bb25eaa77edc7c97e1fed4b3f3f1aff8835d8b1f18aeb6ccbf6042d69b'}]}, 'timestamp': '2025-11-22 08:20:36.742781', '_unique_id': '7fa5f25df6284a749e65dd5a14dd19ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c4dbb16-de9b-4e20-ac91-1ea681ef2e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.743979', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f892c9a-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '0b3f6ce04cc75f2c2015c6425f97683f1b259ee4e34b96bddf29ef8535bce94a'}]}, 'timestamp': '2025-11-22 08:20:36.744246', '_unique_id': '329893306115434bb3d151b8867f3ffb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.744 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.745 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2981a96-199d-4faf-af7b-847827c03da4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.745439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f896584-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': 'f11dc74316537f2d38660d355bb0e8214d7661ce46b8a62d3c01b1431aadcfe0'}]}, 'timestamp': '2025-11-22 08:20:36.745688', '_unique_id': 'a50eecfcb696457a95dab1ddd06de7d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.746 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.772 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/memory.usage volume: 42.4296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfd1628a-3c37-4719-ae26-d82e77cc616c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4296875, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'timestamp': '2025-11-22T08:20:36.746867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1f8d8880-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.442071517, 'message_signature': '62b0fcf21e7a179746b853209500677f64289144782b5e22dea7e34fd6858c22'}]}, 'timestamp': '2025-11-22 08:20:36.772827', '_unique_id': '3cbc1b4fb79a40659b672cd55bee8907'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.773 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.774 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.incoming.bytes volume: 1798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8e3fa0f-a712-477b-a611-e991591e9bea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1798, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.774502', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f8dd7cc-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '2b3da2d3993406e20dd04b001433b4f4265bc3fd2b48f1b0f95c2bd49ffaa1b7'}]}, 'timestamp': '2025-11-22 08:20:36.774838', '_unique_id': 'f6f4fc7b934642da8ee82293162559a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/cpu volume: 13890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b05a5429-2b41-40f0-bb54-a5078365afef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13890000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'timestamp': '2025-11-22T08:20:36.776080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1f8e1282-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.442071517, 'message_signature': '53f67ba301340de0fc75856cf72a2793974f93862d12117625c748a9b3922d5d'}]}, 'timestamp': '2025-11-22 08:20:36.776336', '_unique_id': 'd1d29867feff4392be05b7e87a2a8b4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.777 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8bfaac2-6b81-4bb8-894a-ee6227f6f9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.777542', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f8e4d56-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': 'c25d0e30054d0a90b84556a273b1edbee62b75b7d9943f711fcf2ff2ea008b57'}]}, 'timestamp': '2025-11-22 08:20:36.777838', '_unique_id': '259ecd262a5c4879acdbbf2152d01bad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.779 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.779 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>]
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.779 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37081186-ba6c-4fbb-b071-eef922fc8d88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000099-6b66084f-0e71-48ba-897d-2a2519ece774-tap527c4007-07', 'timestamp': '2025-11-22T08:20:36.779385', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'tap527c4007-07', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:83:36:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap527c4007-07'}, 'message_id': '1f8e93d8-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.341655964, 'message_signature': '50c856af355177edfac3ff54cbbc134a1d728931c051ea9afbe0fa435d6525e7'}]}, 'timestamp': '2025-11-22 08:20:36.779638', '_unique_id': 'd7bbe84f877f43eea1bbf8ff2aa9dd98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.780 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff8a906-a378-48b1-a065-31f862c08352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.780942', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f8ed064-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '9c983e53a7488570942321063f9315006aa4ee361f2de80764d0faf674362c35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.780942', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f8ed9d8-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.351515888, 'message_signature': '8f64255c4c41fbafd154aaa23cb258a65262449e8f3c8732159f7d33073f49b6'}]}, 'timestamp': '2025-11-22 08:20:36.781420', '_unique_id': '0960b5ff4474417395f21a55d4ca18f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.781 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.782 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.782 12 DEBUG ceilometer.compute.pollsters [-] 6b66084f-0e71-48ba-897d-2a2519ece774/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77f1c2ab-ee8f-4b0a-a046-332a0f25546e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '6b66084f-0e71-48ba-897d-2a2519ece774-vda', 'timestamp': '2025-11-22T08:20:36.782651', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f8f12f4-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': 'db320174926eb082dcf6b8a3fcfcedb7f34fa639522d76768d6c48a89db61032'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'6b66084f-0e71-48ba-897d-2a2519ece774-sda', 'timestamp': '2025-11-22T08:20:36.782651', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1523037456', 'name': 'instance-00000099', 'instance_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'instance_type': 'm1.nano', 'host': '44b5b7a1f202c5ebc845b03260c518ddb3091aed33f76f8ae7d3148f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f8f1e66-c77c-11f0-9b25-fa163ecc0304', 'monotonic_time': 6379.384351059, 'message_signature': 'c2106af0e747f9a1283c7b3e6b682fab5ca99be3797c98fbb755e9ea7d74485a'}]}, 'timestamp': '2025-11-22 08:20:36.783170', '_unique_id': '4926ba2ff0544b7096f240399097bf63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.783 12 ERROR oslo_messaging.notify.messaging 
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.784 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 03:20:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:20:36.784 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1523037456>]
Nov 22 03:20:36 np0005531887 podman[238047]: 2025-11-22 08:20:36.836076797 +0000 UTC m=+0.056318504 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:20:37 np0005531887 nova_compute[186849]: 2025-11-22 08:20:37.216 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:37.354 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:37.355 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:37.355 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:39 np0005531887 nova_compute[186849]: 2025-11-22 08:20:39.271 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:40 np0005531887 nova_compute[186849]: 2025-11-22 08:20:40.984 186853 DEBUG nova.compute.manager [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:40 np0005531887 nova_compute[186849]: 2025-11-22 08:20:40.985 186853 DEBUG nova.compute.manager [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing instance network info cache due to event network-changed-527c4007-077d-42a6-9f7c-79c07ded5ed7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:20:40 np0005531887 nova_compute[186849]: 2025-11-22 08:20:40.985 186853 DEBUG oslo_concurrency.lockutils [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:20:40 np0005531887 nova_compute[186849]: 2025-11-22 08:20:40.985 186853 DEBUG oslo_concurrency.lockutils [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:20:40 np0005531887 nova_compute[186849]: 2025-11-22 08:20:40.986 186853 DEBUG nova.network.neutron [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Refreshing network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.071 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.071 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.072 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.072 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.072 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.080 186853 INFO nova.compute.manager [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Terminating instance#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.086 186853 DEBUG nova.compute.manager [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:20:41 np0005531887 kernel: tap527c4007-07 (unregistering): left promiscuous mode
Nov 22 03:20:41 np0005531887 NetworkManager[55210]: <info>  [1763799641.1140] device (tap527c4007-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.125 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:41Z|00527|binding|INFO|Releasing lport 527c4007-077d-42a6-9f7c-79c07ded5ed7 from this chassis (sb_readonly=0)
Nov 22 03:20:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:41Z|00528|binding|INFO|Setting lport 527c4007-077d-42a6-9f7c-79c07ded5ed7 down in Southbound
Nov 22 03:20:41 np0005531887 ovn_controller[95130]: 2025-11-22T08:20:41Z|00529|binding|INFO|Removing iface tap527c4007-07 ovn-installed in OVS
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.152 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000099.scope: Deactivated successfully.
Nov 22 03:20:41 np0005531887 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000099.scope: Consumed 15.816s CPU time.
Nov 22 03:20:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:41.156 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:36:2c 10.100.0.8 2001:db8:0:1:f816:3eff:fe83:362c 2001:db8::f816:3eff:fe83:362c'], port_security=['fa:16:3e:83:36:2c 10.100.0.8 2001:db8:0:1:f816:3eff:fe83:362c 2001:db8::f816:3eff:fe83:362c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28 2001:db8:0:1:f816:3eff:fe83:362c/64 2001:db8::f816:3eff:fe83:362c/64', 'neutron:device_id': '6b66084f-0e71-48ba-897d-2a2519ece774', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '04c6695d-d046-49fb-a069-528067303a16', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=527c4007-077d-42a6-9f7c-79c07ded5ed7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:41.157 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 527c4007-077d-42a6-9f7c-79c07ded5ed7 in datapath cfb1249f-37ac-4df7-b559-e7968406997d unbound from our chassis#033[00m
Nov 22 03:20:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:41.159 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:20:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:41.160 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[812afc4a-64a2-4d47-bd1e-d5cc2bd16adb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:41.160 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d namespace which is not needed anymore#033[00m
Nov 22 03:20:41 np0005531887 systemd-machined[153180]: Machine qemu-56-instance-00000099 terminated.
Nov 22 03:20:41 np0005531887 podman[238068]: 2025-11-22 08:20:41.197313546 +0000 UTC m=+0.057535374 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.311 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.357 186853 INFO nova.virt.libvirt.driver [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Instance destroyed successfully.#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.358 186853 DEBUG nova.objects.instance [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 6b66084f-0e71-48ba-897d-2a2519ece774 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.377 186853 DEBUG nova.virt.libvirt.vif [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:20:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1523037456',display_name='tempest-TestGettingAddress-server-1523037456',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1523037456',id=153,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFe+Iivl03JqMU254FLdbdmYsFU6DbbfEEAr4K/FY8GDuQ0mBhcnts9hxMb1kzVXY50lm7S9mYwpnOmQECnf5XsVq5CeQ5VY2CUHiqO5dq+d/xeUGYNH940WARTUGprt0Q==',key_name='tempest-TestGettingAddress-223844638',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-5t8n4cws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:16Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=6b66084f-0e71-48ba-897d-2a2519ece774,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.377 186853 DEBUG nova.network.os_vif_util [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.379 186853 DEBUG nova.network.os_vif_util [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.380 186853 DEBUG os_vif [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.382 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap527c4007-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.384 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.391 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.395 186853 INFO os_vif [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:36:2c,bridge_name='br-int',has_traffic_filtering=True,id=527c4007-077d-42a6-9f7c-79c07ded5ed7,network=Network(cfb1249f-37ac-4df7-b559-e7968406997d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap527c4007-07')#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.395 186853 INFO nova.virt.libvirt.driver [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Deleting instance files /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774_del#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.396 186853 INFO nova.virt.libvirt.driver [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Deletion of /var/lib/nova/instances/6b66084f-0e71-48ba-897d-2a2519ece774_del complete#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.591 186853 INFO nova.compute.manager [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.593 186853 DEBUG oslo.service.loopingcall [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.594 186853 DEBUG nova.compute.manager [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:20:41 np0005531887 nova_compute[186849]: 2025-11-22 08:20:41.595 186853 DEBUG nova.network.neutron [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [NOTICE]   (237925) : haproxy version is 2.8.14-c23fe91
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [NOTICE]   (237925) : path to executable is /usr/sbin/haproxy
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [WARNING]  (237925) : Exiting Master process...
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [WARNING]  (237925) : Exiting Master process...
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [ALERT]    (237925) : Current worker (237930) exited with code 143 (Terminated)
Nov 22 03:20:41 np0005531887 neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d[237906]: [WARNING]  (237925) : All workers exited. Exiting... (0)
Nov 22 03:20:41 np0005531887 systemd[1]: libpod-a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394.scope: Deactivated successfully.
Nov 22 03:20:41 np0005531887 podman[238114]: 2025-11-22 08:20:41.727096126 +0000 UTC m=+0.458100839 container died a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:20:42 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394-userdata-shm.mount: Deactivated successfully.
Nov 22 03:20:42 np0005531887 systemd[1]: var-lib-containers-storage-overlay-b9c3cdcc9478ca4c22b245800c15df0c897161f0f1ac649976b87a45523c9323-merged.mount: Deactivated successfully.
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.218 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.510 186853 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-unplugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.511 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.512 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.512 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.512 186853 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] No waiting events found dispatching network-vif-unplugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.513 186853 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-unplugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.513 186853 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.514 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.514 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.515 186853 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.515 186853 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] No waiting events found dispatching network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.515 186853 WARNING nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received unexpected event network-vif-plugged-527c4007-077d-42a6-9f7c-79c07ded5ed7 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:20:42 np0005531887 podman[238114]: 2025-11-22 08:20:42.598628059 +0000 UTC m=+1.329632802 container cleanup a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:20:42 np0005531887 systemd[1]: libpod-conmon-a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394.scope: Deactivated successfully.
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.877 186853 DEBUG nova.network.neutron [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.920 186853 INFO nova.compute.manager [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Took 1.33 seconds to deallocate network for instance.#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.989 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:20:42 np0005531887 nova_compute[186849]: 2025-11-22 08:20:42.990 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.056 186853 DEBUG nova.compute.provider_tree [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.065 186853 DEBUG nova.compute.manager [req-0aa16019-a253-4959-9000-4526889dcdd3 req-35fc957c-4585-4182-9373-cf63d2d1439b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Received event network-vif-deleted-527c4007-077d-42a6-9f7c-79c07ded5ed7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.082 186853 DEBUG nova.scheduler.client.report [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.110 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.150 186853 INFO nova.scheduler.client.report [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 6b66084f-0e71-48ba-897d-2a2519ece774#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.235 186853 DEBUG oslo_concurrency.lockutils [None req-ee41bf50-7c5f-4b9f-af84-f7b76a5b7c73 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "6b66084f-0e71-48ba-897d-2a2519ece774" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.476 186853 DEBUG nova.network.neutron [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updated VIF entry in instance network info cache for port 527c4007-077d-42a6-9f7c-79c07ded5ed7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.477 186853 DEBUG nova.network.neutron [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Updating instance_info_cache with network_info: [{"id": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "address": "fa:16:3e:83:36:2c", "network": {"id": "cfb1249f-37ac-4df7-b559-e7968406997d", "bridge": "br-int", "label": "tempest-network-smoke--1635184120", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe83:362c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap527c4007-07", "ovs_interfaceid": "527c4007-077d-42a6-9f7c-79c07ded5ed7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.500 186853 DEBUG oslo_concurrency.lockutils [req-07b2545e-181c-44d5-b953-486098446d35 req-99bcb3d8-9dff-4e23-9bae-3b5752ae6ff8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6b66084f-0e71-48ba-897d-2a2519ece774" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:20:43 np0005531887 podman[238159]: 2025-11-22 08:20:43.738490277 +0000 UTC m=+1.110573966 container remove a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.743 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8090617f-7c6d-4b25-8f34-6a6eb149f50e]: (4, ('Sat Nov 22 08:20:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d (a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394)\na81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394\nSat Nov 22 08:20:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d (a81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394)\na81a37670ca76e12f89c577eda314323e967f14369219817d3d22b33bbf17394\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.746 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4ce0f7-bc15-40d5-8f9f-434d25c5970e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.747 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfb1249f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.749 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:43 np0005531887 kernel: tapcfb1249f-30: left promiscuous mode
Nov 22 03:20:43 np0005531887 nova_compute[186849]: 2025-11-22 08:20:43.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.781 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fb93a5a5-c5ea-4e7f-b991-ffa255942fb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.795 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2b59c6f5-607b-43c4-a281-6c9d3b364826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.798 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[14d12cd9-cd7a-4681-86b0-e6e73c2918b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.821 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[233c4cfd-fe78-492a-96e6-00a88471afda]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635800, 'reachable_time': 28166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238175, 'error': None, 'target': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.825 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:20:43 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:43.825 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[275c571f-6ba9-47f1-bc8e-81de79c635f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:20:43 np0005531887 systemd[1]: run-netns-ovnmeta\x2dcfb1249f\x2d37ac\x2d4df7\x2db559\x2de7968406997d.mount: Deactivated successfully.
Nov 22 03:20:46 np0005531887 nova_compute[186849]: 2025-11-22 08:20:46.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:46 np0005531887 podman[238176]: 2025-11-22 08:20:46.86614543 +0000 UTC m=+0.077951108 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:20:47 np0005531887 nova_compute[186849]: 2025-11-22 08:20:47.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531887 nova_compute[186849]: 2025-11-22 08:20:51.389 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:51 np0005531887 podman[238197]: 2025-11-22 08:20:51.86595449 +0000 UTC m=+0.079737662 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:20:51 np0005531887 podman[238198]: 2025-11-22 08:20:51.930459606 +0000 UTC m=+0.133169625 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:20:52 np0005531887 nova_compute[186849]: 2025-11-22 08:20:52.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:53.543 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:20:53 np0005531887 nova_compute[186849]: 2025-11-22 08:20:53.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:20:53.544 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.356 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799641.3551612, 6b66084f-0e71-48ba-897d-2a2519ece774 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.357 186853 INFO nova.compute.manager [-] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.380 186853 DEBUG nova.compute.manager [None req-b9e7f277-1e27-41f5-a678-5aa732ffb4da - - - - - -] [instance: 6b66084f-0e71-48ba-897d-2a2519ece774] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.392 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:56 np0005531887 podman[238243]: 2025-11-22 08:20:56.438405883 +0000 UTC m=+0.051471204 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.726 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:56 np0005531887 nova_compute[186849]: 2025-11-22 08:20:56.841 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:20:57 np0005531887 nova_compute[186849]: 2025-11-22 08:20:57.227 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:01 np0005531887 nova_compute[186849]: 2025-11-22 08:21:01.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:01.548 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:02 np0005531887 nova_compute[186849]: 2025-11-22 08:21:02.228 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:02 np0005531887 nova_compute[186849]: 2025-11-22 08:21:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:02 np0005531887 podman[238269]: 2025-11-22 08:21:02.837380143 +0000 UTC m=+0.054310173 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 03:21:06 np0005531887 nova_compute[186849]: 2025-11-22 08:21:06.402 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:07 np0005531887 nova_compute[186849]: 2025-11-22 08:21:07.231 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:07 np0005531887 podman[238289]: 2025-11-22 08:21:07.849205182 +0000 UTC m=+0.065793698 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible)
Nov 22 03:21:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:09.195 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:09.196 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated#033[00m
Nov 22 03:21:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:09.197 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:21:09 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:09.199 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[19861d58-0c8e-4a8f-855f-bc223bea5085]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:09 np0005531887 nova_compute[186849]: 2025-11-22 08:21:09.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.781 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.804 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.806 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.974 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.975 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.27377319335938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.975 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:10 np0005531887 nova_compute[186849]: 2025-11-22 08:21:10.976 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.020 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.021 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.037 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.057 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.057 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.077 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.107 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.135 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.157 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.178 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.178 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:11 np0005531887 nova_compute[186849]: 2025-11-22 08:21:11.404 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:11 np0005531887 podman[238313]: 2025-11-22 08:21:11.867000028 +0000 UTC m=+0.084131381 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:21:12 np0005531887 nova_compute[186849]: 2025-11-22 08:21:12.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:12.687 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8:0:1:f816:3eff:febd:96a1 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:febd:96a1/64 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:12.690 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated#033[00m
Nov 22 03:21:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:12.692 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:21:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:12.693 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d61eb790-84be-4efb-8d5f-56faa6b584b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:16 np0005531887 nova_compute[186849]: 2025-11-22 08:21:16.166 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:16 np0005531887 nova_compute[186849]: 2025-11-22 08:21:16.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:17 np0005531887 nova_compute[186849]: 2025-11-22 08:21:17.236 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:17 np0005531887 podman[238338]: 2025-11-22 08:21:17.845319206 +0000 UTC m=+0.069556820 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 22 03:21:18 np0005531887 nova_compute[186849]: 2025-11-22 08:21:18.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:21 np0005531887 nova_compute[186849]: 2025-11-22 08:21:21.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531887 nova_compute[186849]: 2025-11-22 08:21:22.238 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:22 np0005531887 podman[238359]: 2025-11-22 08:21:22.856074728 +0000 UTC m=+0.065516102 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:21:22 np0005531887 podman[238360]: 2025-11-22 08:21:22.918125342 +0000 UTC m=+0.116696087 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 03:21:23 np0005531887 nova_compute[186849]: 2025-11-22 08:21:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:21:26 np0005531887 nova_compute[186849]: 2025-11-22 08:21:26.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:26 np0005531887 podman[238406]: 2025-11-22 08:21:26.836291025 +0000 UTC m=+0.052547521 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:21:27 np0005531887 nova_compute[186849]: 2025-11-22 08:21:27.239 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:31 np0005531887 nova_compute[186849]: 2025-11-22 08:21:31.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:32 np0005531887 nova_compute[186849]: 2025-11-22 08:21:32.242 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:33 np0005531887 podman[238428]: 2025-11-22 08:21:33.845714271 +0000 UTC m=+0.060109428 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:21:36 np0005531887 nova_compute[186849]: 2025-11-22 08:21:36.414 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:37 np0005531887 nova_compute[186849]: 2025-11-22 08:21:37.244 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:37.355 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:37.355 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:37.356 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:38 np0005531887 podman[238449]: 2025-11-22 08:21:38.848143336 +0000 UTC m=+0.065724386 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:21:41 np0005531887 nova_compute[186849]: 2025-11-22 08:21:41.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:42 np0005531887 nova_compute[186849]: 2025-11-22 08:21:42.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:42 np0005531887 podman[238470]: 2025-11-22 08:21:42.840508243 +0000 UTC m=+0.058130148 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.452 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.453 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.468 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.587 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.588 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.601 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.602 186853 INFO nova.compute.claims [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.733 186853 DEBUG nova.compute.provider_tree [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.756 186853 DEBUG nova.scheduler.client.report [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.779 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.780 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.830 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.830 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.848 186853 INFO nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.873 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.981 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.982 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.983 186853 INFO nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating image(s)#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.984 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.984 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:43 np0005531887 nova_compute[186849]: 2025-11-22 08:21:43.986 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.000 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.087 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.089 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.090 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.108 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.132 186853 DEBUG nova.policy [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.168 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.169 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.209 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.211 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.211 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.277 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.279 186853 DEBUG nova.virt.disk.api [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.279 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.347 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.348 186853 DEBUG nova.virt.disk.api [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.349 186853 DEBUG nova.objects.instance [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.361 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.362 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Ensure instance console log exists: /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.362 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.363 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:44 np0005531887 nova_compute[186849]: 2025-11-22 08:21:44.363 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:45 np0005531887 nova_compute[186849]: 2025-11-22 08:21:45.052 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Successfully created port: 9887acef-e389-49e2-87d8-70796da43759 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:21:45 np0005531887 nova_compute[186849]: 2025-11-22 08:21:45.977 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Successfully updated port: 9887acef-e389-49e2-87d8-70796da43759 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:21:45 np0005531887 nova_compute[186849]: 2025-11-22 08:21:45.989 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:45 np0005531887 nova_compute[186849]: 2025-11-22 08:21:45.989 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:45 np0005531887 nova_compute[186849]: 2025-11-22 08:21:45.990 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:21:46 np0005531887 nova_compute[186849]: 2025-11-22 08:21:46.104 186853 DEBUG nova.compute.manager [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-changed-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:46 np0005531887 nova_compute[186849]: 2025-11-22 08:21:46.105 186853 DEBUG nova.compute.manager [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing instance network info cache due to event network-changed-9887acef-e389-49e2-87d8-70796da43759. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:21:46 np0005531887 nova_compute[186849]: 2025-11-22 08:21:46.105 186853 DEBUG oslo_concurrency.lockutils [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:46 np0005531887 nova_compute[186849]: 2025-11-22 08:21:46.419 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.045 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.248 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.850 186853 DEBUG nova.network.neutron [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.901 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.902 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance network_info: |[{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.902 186853 DEBUG oslo_concurrency.lockutils [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.903 186853 DEBUG nova.network.neutron [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing network info cache for port 9887acef-e389-49e2-87d8-70796da43759 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.906 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Start _get_guest_xml network_info=[{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.911 186853 WARNING nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.915 186853 DEBUG nova.virt.libvirt.host [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.916 186853 DEBUG nova.virt.libvirt.host [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.920 186853 DEBUG nova.virt.libvirt.host [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.921 186853 DEBUG nova.virt.libvirt.host [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.922 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.922 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.923 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.923 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.924 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.924 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.924 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.924 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.925 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.925 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.925 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.925 186853 DEBUG nova.virt.hardware [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.930 186853 DEBUG nova.virt.libvirt.vif [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:43Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.931 186853 DEBUG nova.network.os_vif_util [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.931 186853 DEBUG nova.network.os_vif_util [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.932 186853 DEBUG nova.objects.instance [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.946 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <uuid>1ef73533-7ed2-422b-a432-c1f12dbc7323</uuid>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <name>instance-0000009b</name>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-896988644</nova:name>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:21:47</nova:creationTime>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        <nova:port uuid="9887acef-e389-49e2-87d8-70796da43759">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="serial">1ef73533-7ed2-422b-a432-c1f12dbc7323</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="uuid">1ef73533-7ed2-422b-a432-c1f12dbc7323</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:b4:b4:21"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <target dev="tap9887acef-e3"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/console.log" append="off"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:21:47 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:21:47 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:21:47 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:21:47 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.948 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Preparing to wait for external event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.948 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.948 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.949 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.950 186853 DEBUG nova.virt.libvirt.vif [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:43Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.950 186853 DEBUG nova.network.os_vif_util [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.951 186853 DEBUG nova.network.os_vif_util [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.951 186853 DEBUG os_vif [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.952 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.952 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.952 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.956 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.956 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9887acef-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.957 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9887acef-e3, col_values=(('external_ids', {'iface-id': '9887acef-e389-49e2-87d8-70796da43759', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:b4:21', 'vm-uuid': '1ef73533-7ed2-422b-a432-c1f12dbc7323'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.959 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 NetworkManager[55210]: <info>  [1763799707.9603] manager: (tap9887acef-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.967 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:47 np0005531887 nova_compute[186849]: 2025-11-22 08:21:47.968 186853 INFO os_vif [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3')#033[00m
Nov 22 03:21:48 np0005531887 nova_compute[186849]: 2025-11-22 08:21:48.039 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:21:48 np0005531887 nova_compute[186849]: 2025-11-22 08:21:48.040 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:21:48 np0005531887 nova_compute[186849]: 2025-11-22 08:21:48.040 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:b4:b4:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:21:48 np0005531887 nova_compute[186849]: 2025-11-22 08:21:48.040 186853 INFO nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Using config drive#033[00m
Nov 22 03:21:48 np0005531887 podman[238512]: 2025-11-22 08:21:48.844154116 +0000 UTC m=+0.058546149 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64)
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.060 186853 INFO nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating config drive at /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config#033[00m
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.065 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvnxswyeg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.192 186853 DEBUG oslo_concurrency.processutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvnxswyeg" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:21:49 np0005531887 kernel: tap9887acef-e3: entered promiscuous mode
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.2736] manager: (tap9887acef-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Nov 22 03:21:49 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:49Z|00530|binding|INFO|Claiming lport 9887acef-e389-49e2-87d8-70796da43759 for this chassis.
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.276 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:49Z|00531|binding|INFO|9887acef-e389-49e2-87d8-70796da43759: Claiming fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.282 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.285 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.292 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.294 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 bound to our chassis#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.295 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cee14e96-7070-410d-8934-e305861050e3#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.307 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f0eb5e1d-2a6c-40dc-809c-30521a6c2e7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.308 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcee14e96-71 in ovnmeta-cee14e96-7070-410d-8934-e305861050e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.310 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcee14e96-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.310 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[44065b9a-3060-4f5b-8ede-b4f2f6479a28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.311 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[545cfdc1-4000-4a85-909b-ddd34a1d6d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 systemd-udevd[238552]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.322 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[5d92df34-9f89-4506-8255-f9a980f2b13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 systemd-machined[153180]: New machine qemu-57-instance-0000009b.
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.3338] device (tap9887acef-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.3359] device (tap9887acef-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.347 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7cdbb220-4d73-47f7-bec6-584cd1ec18d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:49Z|00532|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 ovn-installed in OVS
Nov 22 03:21:49 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:49Z|00533|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 up in Southbound
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.353 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 systemd[1]: Started Virtual Machine qemu-57-instance-0000009b.
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.383 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[3e491fbf-44b9-43e1-8943-c2d537dfdb96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.390 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c43c4e70-f245-4c9a-8934-7d0549c66655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.3914] manager: (tapcee14e96-70): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Nov 22 03:21:49 np0005531887 systemd-udevd[238557]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.427 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[56382897-0701-4f25-be8f-f8caa8574d01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.430 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[60f0d06c-6a23-439b-8a28-48d14cdedecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.4554] device (tapcee14e96-70): carrier: link connected
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.461 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[9004dcd1-1f6f-450a-857b-821874af0e03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.481 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7f149e-9a1e-4054-b3c2-dfcd2c5ddcb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee14e96-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:02:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645206, 'reachable_time': 16701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238587, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.496 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[02034e9f-0ed8-4aa7-84ea-657e960f69ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:219'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645206, 'tstamp': 645206}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238588, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.516 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9dccdc60-0cae-42b7-83e1-937a46b6a5aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee14e96-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:02:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645206, 'reachable_time': 16701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238589, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.553 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[32eccc38-c13d-4c32-8db9-8b7af1f43044]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.610 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9726e12d-c120-48f1-bd24-299909607169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.612 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee14e96-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.613 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.613 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcee14e96-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.615 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 kernel: tapcee14e96-70: entered promiscuous mode
Nov 22 03:21:49 np0005531887 NetworkManager[55210]: <info>  [1763799709.6162] manager: (tapcee14e96-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.618 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcee14e96-70, col_values=(('external_ids', {'iface-id': 'a71e340e-db50-4811-9d79-16735ac5733d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:21:49 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:49Z|00534|binding|INFO|Releasing lport a71e340e-db50-4811-9d79-16735ac5733d from this chassis (sb_readonly=0)
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 nova_compute[186849]: 2025-11-22 08:21:49.632 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.634 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.635 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[93c2138a-e3e6-4375-a070-d6584d006fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.636 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-cee14e96-7070-410d-8934-e305861050e3
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID cee14e96-7070-410d-8934-e305861050e3
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:21:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:49.638 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'env', 'PROCESS_TAG=haproxy-cee14e96-7070-410d-8934-e305861050e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cee14e96-7070-410d-8934-e305861050e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:21:50 np0005531887 podman[238620]: 2025-11-22 08:21:50.021332476 +0000 UTC m=+0.054485108 container create a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:21:50 np0005531887 systemd[1]: Started libpod-conmon-a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4.scope.
Nov 22 03:21:50 np0005531887 podman[238620]: 2025-11-22 08:21:49.990124585 +0000 UTC m=+0.023277237 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:21:50 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:21:50 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d7d823441e7fe0787bc75d60febf3924a3a9b8495d3f09b91107da54e884008/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:21:50 np0005531887 podman[238620]: 2025-11-22 08:21:50.129021849 +0000 UTC m=+0.162174511 container init a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:21:50 np0005531887 podman[238620]: 2025-11-22 08:21:50.136422712 +0000 UTC m=+0.169575344 container start a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:21:50 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [NOTICE]   (238639) : New worker (238641) forked
Nov 22 03:21:50 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [NOTICE]   (238639) : Loading success.
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.363 186853 DEBUG nova.compute.manager [req-a17aee67-a897-475b-86d9-1d58c2e36af4 req-3cf402b8-dc4d-465e-8a0a-3b82b000b526 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.364 186853 DEBUG oslo_concurrency.lockutils [req-a17aee67-a897-475b-86d9-1d58c2e36af4 req-3cf402b8-dc4d-465e-8a0a-3b82b000b526 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.364 186853 DEBUG oslo_concurrency.lockutils [req-a17aee67-a897-475b-86d9-1d58c2e36af4 req-3cf402b8-dc4d-465e-8a0a-3b82b000b526 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.364 186853 DEBUG oslo_concurrency.lockutils [req-a17aee67-a897-475b-86d9-1d58c2e36af4 req-3cf402b8-dc4d-465e-8a0a-3b82b000b526 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.364 186853 DEBUG nova.compute.manager [req-a17aee67-a897-475b-86d9-1d58c2e36af4 req-3cf402b8-dc4d-465e-8a0a-3b82b000b526 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Processing event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.413 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799710.4128294, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.414 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Started (Lifecycle Event)#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.417 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.421 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.425 186853 INFO nova.virt.libvirt.driver [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance spawned successfully.#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.426 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.433 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.436 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.451 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.452 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.452 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.453 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.453 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.454 186853 DEBUG nova.virt.libvirt.driver [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.459 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.459 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799710.4140399, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.459 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.484 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.488 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799710.420289, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.488 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.507 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.511 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.524 186853 INFO nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Took 6.54 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.525 186853 DEBUG nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.528 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.588 186853 INFO nova.compute.manager [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Took 7.07 seconds to build instance.#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.603 186853 DEBUG oslo_concurrency.lockutils [None req-3f5d3fda-3c40-472a-9d4e-4e9086c96fb5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.836 186853 DEBUG nova.network.neutron [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updated VIF entry in instance network info cache for port 9887acef-e389-49e2-87d8-70796da43759. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.837 186853 DEBUG nova.network.neutron [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:50 np0005531887 nova_compute[186849]: 2025-11-22 08:21:50.849 186853 DEBUG oslo_concurrency.lockutils [req-faf6ff2b-8f43-466e-91ba-38228887976d req-0902bcad-83be-40c7-ae74-b2b2d1055c3c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.251 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.465 186853 DEBUG nova.compute.manager [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.466 186853 DEBUG oslo_concurrency.lockutils [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.466 186853 DEBUG oslo_concurrency.lockutils [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.466 186853 DEBUG oslo_concurrency.lockutils [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.466 186853 DEBUG nova.compute.manager [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.467 186853 WARNING nova.compute.manager [req-57ba975d-0135-4a3e-9eed-325def9eddbe req-aa43f527-b75f-4101-a7f5-3b5e82c77a9c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:21:52 np0005531887 nova_compute[186849]: 2025-11-22 08:21:52.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:53 np0005531887 podman[238657]: 2025-11-22 08:21:53.852213 +0000 UTC m=+0.064793763 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:21:53 np0005531887 podman[238658]: 2025-11-22 08:21:53.87932133 +0000 UTC m=+0.093670637 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:21:55 np0005531887 nova_compute[186849]: 2025-11-22 08:21:55.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:55 np0005531887 NetworkManager[55210]: <info>  [1763799715.5332] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 22 03:21:55 np0005531887 NetworkManager[55210]: <info>  [1763799715.5349] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Nov 22 03:21:55 np0005531887 nova_compute[186849]: 2025-11-22 08:21:55.585 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:55 np0005531887 ovn_controller[95130]: 2025-11-22T08:21:55Z|00535|binding|INFO|Releasing lport a71e340e-db50-4811-9d79-16735ac5733d from this chassis (sb_readonly=0)
Nov 22 03:21:55 np0005531887 nova_compute[186849]: 2025-11-22 08:21:55.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:56 np0005531887 nova_compute[186849]: 2025-11-22 08:21:56.348 186853 DEBUG nova.compute.manager [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-changed-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:21:56 np0005531887 nova_compute[186849]: 2025-11-22 08:21:56.349 186853 DEBUG nova.compute.manager [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing instance network info cache due to event network-changed-9887acef-e389-49e2-87d8-70796da43759. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:21:56 np0005531887 nova_compute[186849]: 2025-11-22 08:21:56.349 186853 DEBUG oslo_concurrency.lockutils [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:21:56 np0005531887 nova_compute[186849]: 2025-11-22 08:21:56.350 186853 DEBUG oslo_concurrency.lockutils [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:21:56 np0005531887 nova_compute[186849]: 2025-11-22 08:21:56.350 186853 DEBUG nova.network.neutron [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing network info cache for port 9887acef-e389-49e2-87d8-70796da43759 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:21:57 np0005531887 nova_compute[186849]: 2025-11-22 08:21:57.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:57 np0005531887 podman[238704]: 2025-11-22 08:21:57.838963848 +0000 UTC m=+0.058379004 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:21:57 np0005531887 nova_compute[186849]: 2025-11-22 08:21:57.962 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:58.535 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:21:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:21:58.536 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:21:58 np0005531887 nova_compute[186849]: 2025-11-22 08:21:58.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:21:59 np0005531887 nova_compute[186849]: 2025-11-22 08:21:59.128 186853 DEBUG nova.network.neutron [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updated VIF entry in instance network info cache for port 9887acef-e389-49e2-87d8-70796da43759. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:21:59 np0005531887 nova_compute[186849]: 2025-11-22 08:21:59.129 186853 DEBUG nova.network.neutron [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:21:59 np0005531887 nova_compute[186849]: 2025-11-22 08:21:59.189 186853 DEBUG oslo_concurrency.lockutils [req-56e8e99c-26db-4c6a-8aa3-6bfc2394c1ec req-ef312686-f875-47f2-a1d6-3d1aead289fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:22:02 np0005531887 nova_compute[186849]: 2025-11-22 08:22:02.257 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:02 np0005531887 nova_compute[186849]: 2025-11-22 08:22:02.964 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:04 np0005531887 nova_compute[186849]: 2025-11-22 08:22:04.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:04 np0005531887 podman[238753]: 2025-11-22 08:22:04.848372714 +0000 UTC m=+0.067744486 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:22:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:22:05Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 03:22:05 np0005531887 ovn_controller[95130]: 2025-11-22T08:22:05Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 03:22:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:06.538 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:07 np0005531887 nova_compute[186849]: 2025-11-22 08:22:07.258 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:07 np0005531887 nova_compute[186849]: 2025-11-22 08:22:07.967 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:09 np0005531887 nova_compute[186849]: 2025-11-22 08:22:09.800 186853 INFO nova.compute.manager [None req-2af5c291-792b-4039-9c90-0f1d8aab285d d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Get console output#033[00m
Nov 22 03:22:09 np0005531887 nova_compute[186849]: 2025-11-22 08:22:09.811 213402 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:22:09 np0005531887 podman[238772]: 2025-11-22 08:22:09.872530408 +0000 UTC m=+0.085587847 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 03:22:10 np0005531887 nova_compute[186849]: 2025-11-22 08:22:10.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:10 np0005531887 nova_compute[186849]: 2025-11-22 08:22:10.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:10 np0005531887 nova_compute[186849]: 2025-11-22 08:22:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:22:10 np0005531887 nova_compute[186849]: 2025-11-22 08:22:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.056 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.057 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.057 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.058 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.261 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:12 np0005531887 nova_compute[186849]: 2025-11-22 08:22:12.969 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:13 np0005531887 nova_compute[186849]: 2025-11-22 08:22:13.376 186853 INFO nova.compute.manager [None req-f14ead96-af7e-41cb-8ff4-a7f2c42689c1 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Get console output#033[00m
Nov 22 03:22:13 np0005531887 nova_compute[186849]: 2025-11-22 08:22:13.383 213402 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:22:13 np0005531887 podman[238791]: 2025-11-22 08:22:13.835823596 +0000 UTC m=+0.056397855 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.712 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.742 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.742 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.743 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.743 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.743 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.743 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.763 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.763 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.763 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.764 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.827 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.909 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.910 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:15 np0005531887 nova_compute[186849]: 2025-11-22 08:22:15.973 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.125 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.127 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5547MB free_disk=73.24486541748047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.127 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.127 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.220 186853 INFO nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance a35794db-cb93-4d6b-9acb-ff35fa95c4f0 has allocations against this compute host but is not found in the database.#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.220 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.221 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.280 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.292 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.313 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:22:16 np0005531887 nova_compute[186849]: 2025-11-22 08:22:16.314 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.262 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.755 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Check if temp file /var/lib/nova/instances/tmpdu6nfow8 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.761 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.832 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.833 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.897 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.899 186853 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 22 03:22:17 np0005531887 nova_compute[186849]: 2025-11-22 08:22:17.972 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:18 np0005531887 nova_compute[186849]: 2025-11-22 08:22:18.341 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:19 np0005531887 podman[238828]: 2025-11-22 08:22:19.850355671 +0000 UTC m=+0.070393731 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:22:20 np0005531887 nova_compute[186849]: 2025-11-22 08:22:20.391 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:20 np0005531887 nova_compute[186849]: 2025-11-22 08:22:20.462 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:20 np0005531887 nova_compute[186849]: 2025-11-22 08:22:20.464 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:22:20 np0005531887 nova_compute[186849]: 2025-11-22 08:22:20.530 186853 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:22:20 np0005531887 nova_compute[186849]: 2025-11-22 08:22:20.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:22 np0005531887 nova_compute[186849]: 2025-11-22 08:22:22.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:22 np0005531887 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 03:22:22 np0005531887 nova_compute[186849]: 2025-11-22 08:22:22.975 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:22 np0005531887 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 03:22:22 np0005531887 systemd-logind[821]: New session 57 of user nova.
Nov 22 03:22:23 np0005531887 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 03:22:23 np0005531887 systemd[1]: Starting User Manager for UID 42436...
Nov 22 03:22:23 np0005531887 systemd[238859]: Queued start job for default target Main User Target.
Nov 22 03:22:23 np0005531887 systemd[238859]: Created slice User Application Slice.
Nov 22 03:22:23 np0005531887 systemd[238859]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:22:23 np0005531887 systemd[238859]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 03:22:23 np0005531887 systemd[238859]: Reached target Paths.
Nov 22 03:22:23 np0005531887 systemd[238859]: Reached target Timers.
Nov 22 03:22:23 np0005531887 systemd[238859]: Starting D-Bus User Message Bus Socket...
Nov 22 03:22:23 np0005531887 systemd[238859]: Starting Create User's Volatile Files and Directories...
Nov 22 03:22:23 np0005531887 systemd[238859]: Finished Create User's Volatile Files and Directories.
Nov 22 03:22:23 np0005531887 systemd[238859]: Listening on D-Bus User Message Bus Socket.
Nov 22 03:22:23 np0005531887 systemd[238859]: Reached target Sockets.
Nov 22 03:22:23 np0005531887 systemd[238859]: Reached target Basic System.
Nov 22 03:22:23 np0005531887 systemd[238859]: Reached target Main User Target.
Nov 22 03:22:23 np0005531887 systemd[238859]: Startup finished in 164ms.
Nov 22 03:22:23 np0005531887 systemd[1]: Started User Manager for UID 42436.
Nov 22 03:22:23 np0005531887 systemd[1]: Started Session 57 of User nova.
Nov 22 03:22:23 np0005531887 systemd[1]: session-57.scope: Deactivated successfully.
Nov 22 03:22:23 np0005531887 systemd-logind[821]: Session 57 logged out. Waiting for processes to exit.
Nov 22 03:22:23 np0005531887 systemd-logind[821]: Removed session 57.
Nov 22 03:22:23 np0005531887 nova_compute[186849]: 2025-11-22 08:22:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:24 np0005531887 podman[238875]: 2025-11-22 08:22:24.855082886 +0000 UTC m=+0.067784968 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:22:24 np0005531887 podman[238876]: 2025-11-22 08:22:24.888328928 +0000 UTC m=+0.097980274 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.372 186853 INFO nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Took 4.84 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.373 186853 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.388 186853 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a35794db-cb93-4d6b-9acb-ff35fa95c4f0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.422 186853 DEBUG nova.objects.instance [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.423 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.426 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.427 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.441 186853 DEBUG nova.virt.libvirt.vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:21:50Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.443 186853 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.444 186853 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.445 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 03:22:25 np0005531887 nova_compute[186849]:  <mac address="fa:16:3e:b4:b4:21"/>
Nov 22 03:22:25 np0005531887 nova_compute[186849]:  <model type="virtio"/>
Nov 22 03:22:25 np0005531887 nova_compute[186849]:  <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:22:25 np0005531887 nova_compute[186849]:  <mtu size="1442"/>
Nov 22 03:22:25 np0005531887 nova_compute[186849]:  <target dev="tap9887acef-e3"/>
Nov 22 03:22:25 np0005531887 nova_compute[186849]: </interface>
Nov 22 03:22:25 np0005531887 nova_compute[186849]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.446 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.761 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.762 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.763 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.763 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.763 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.763 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.763 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.764 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.764 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.764 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.764 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.765 186853 WARNING nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.765 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-changed-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.765 186853 DEBUG nova.compute.manager [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing instance network info cache due to event network-changed-9887acef-e389-49e2-87d8-70796da43759. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.765 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.766 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.766 186853 DEBUG nova.network.neutron [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing network info cache for port 9887acef-e389-49e2-87d8-70796da43759 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.929 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:25 np0005531887 nova_compute[186849]: 2025-11-22 08:22:25.930 186853 INFO nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 22 03:22:26 np0005531887 nova_compute[186849]: 2025-11-22 08:22:26.050 186853 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 22 03:22:26 np0005531887 nova_compute[186849]: 2025-11-22 08:22:26.552 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:26 np0005531887 nova_compute[186849]: 2025-11-22 08:22:26.553 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.055 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.056 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.267 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.404 186853 DEBUG nova.network.neutron [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updated VIF entry in instance network info cache for port 9887acef-e389-49e2-87d8-70796da43759. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.405 186853 DEBUG nova.network.neutron [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.424 186853 DEBUG oslo_concurrency.lockutils [req-8b4c461d-feb1-4299-9f1b-890720ddd4c8 req-c450d341-4e40-4ea4-ac3f-6f505fc03ba9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.560 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.560 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:27 np0005531887 nova_compute[186849]: 2025-11-22 08:22:27.979 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:28 np0005531887 nova_compute[186849]: 2025-11-22 08:22:28.064 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:28 np0005531887 nova_compute[186849]: 2025-11-22 08:22:28.065 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:28 np0005531887 nova_compute[186849]: 2025-11-22 08:22:28.568 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:28 np0005531887 nova_compute[186849]: 2025-11-22 08:22:28.569 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:28 np0005531887 podman[238923]: 2025-11-22 08:22:28.849468804 +0000 UTC m=+0.069706414 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.072 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.072 186853 DEBUG nova.virt.libvirt.migration [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.302 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799749.3014922, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.302 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Paused (Lifecycle Event)
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.325 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.332 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.349 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 22 03:22:29 np0005531887 kernel: tap9887acef-e3 (unregistering): left promiscuous mode
Nov 22 03:22:29 np0005531887 NetworkManager[55210]: <info>  [1763799749.4594] device (tap9887acef-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:22:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:22:29Z|00536|binding|INFO|Releasing lport 9887acef-e389-49e2-87d8-70796da43759 from this chassis (sb_readonly=0)
Nov 22 03:22:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:22:29Z|00537|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 down in Southbound
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.475 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:22:29Z|00538|binding|INFO|Removing iface tap9887acef-e3 ovn-installed in OVS
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:29.486 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'df09844c-c111-44b4-9c36-d4950a55a590'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:22:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:29.487 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 unbound from our chassis
Nov 22 03:22:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:29.488 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee14e96-7070-410d-8934-e305861050e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 03:22:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:29.490 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf23a43-2d7f-4d85-af2a-4a0d6da87eaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:29.491 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cee14e96-7070-410d-8934-e305861050e3 namespace which is not needed anymore
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:29 np0005531887 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Nov 22 03:22:29 np0005531887 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000009b.scope: Consumed 16.484s CPU time.
Nov 22 03:22:29 np0005531887 systemd-machined[153180]: Machine qemu-57-instance-0000009b terminated.
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.660 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.703 186853 DEBUG nova.virt.libvirt.guest [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.703 186853 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migration operation has completed
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.703 186853 INFO nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] _post_live_migration() is started..
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.706 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.706 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 22 03:22:29 np0005531887 nova_compute[186849]: 2025-11-22 08:22:29.706 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [NOTICE]   (238639) : haproxy version is 2.8.14-c23fe91
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [NOTICE]   (238639) : path to executable is /usr/sbin/haproxy
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [WARNING]  (238639) : Exiting Master process...
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [WARNING]  (238639) : Exiting Master process...
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [ALERT]    (238639) : Current worker (238641) exited with code 143 (Terminated)
Nov 22 03:22:29 np0005531887 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[238635]: [WARNING]  (238639) : All workers exited. Exiting... (0)
Nov 22 03:22:29 np0005531887 systemd[1]: libpod-a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4.scope: Deactivated successfully.
Nov 22 03:22:29 np0005531887 podman[238974]: 2025-11-22 08:22:29.726878822 +0000 UTC m=+0.128374396 container died a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:22:29 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4-userdata-shm.mount: Deactivated successfully.
Nov 22 03:22:29 np0005531887 systemd[1]: var-lib-containers-storage-overlay-5d7d823441e7fe0787bc75d60febf3924a3a9b8495d3f09b91107da54e884008-merged.mount: Deactivated successfully.
Nov 22 03:22:29 np0005531887 podman[238974]: 2025-11-22 08:22:29.890307453 +0000 UTC m=+0.291803027 container cleanup a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:22:29 np0005531887 systemd[1]: libpod-conmon-a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4.scope: Deactivated successfully.
Nov 22 03:22:30 np0005531887 podman[239021]: 2025-11-22 08:22:30.036446567 +0000 UTC m=+0.119055935 container remove a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.047 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[46bb1376-833d-4a26-8639-ca697b6d891b]: (4, ('Sat Nov 22 08:22:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3 (a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4)\na7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4\nSat Nov 22 08:22:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3 (a7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4)\na7425acb6369c2eeb9d97539ea5bb6bc955115ebe4b61792631aafd39b3b29d4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.049 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[352a09ff-478a-4d7b-ba25-49df75537d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.051 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee14e96-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.054 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:30 np0005531887 kernel: tapcee14e96-70: left promiscuous mode
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.072 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.075 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[461969d8-9ee2-4704-a8fd-5e8acb2cb680]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.092 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0c90f1f8-67f7-4d55-95b4-f9cfd257fe8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.094 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[284ff713-8adb-4a57-bbf3-a8c964013d55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.110 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[430d80df-53dc-4aa7-b5b1-b569d3225d9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645198, 'reachable_time': 27867, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239039, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 systemd[1]: run-netns-ovnmeta\x2dcee14e96\x2d7070\x2d410d\x2d8934\x2de305861050e3.mount: Deactivated successfully.
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.116 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cee14e96-7070-410d-8934-e305861050e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 03:22:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:30.117 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[26fb8a67-66b7-47f3-a540-a13ef35e2469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.235 186853 DEBUG nova.compute.manager [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.236 186853 DEBUG oslo_concurrency.lockutils [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.237 186853 DEBUG oslo_concurrency.lockutils [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.237 186853 DEBUG oslo_concurrency.lockutils [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.238 186853 DEBUG nova.compute.manager [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:30 np0005531887 nova_compute[186849]: 2025-11-22 08:22:30.238 186853 DEBUG nova.compute.manager [req-4ee729b8-e2c4-4089-bb93-0324bfb718ef req-9fa17572-dfc1-4be0-bf7e-ebe2361271cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.187 186853 DEBUG nova.compute.manager [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.188 186853 DEBUG oslo_concurrency.lockutils [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.188 186853 DEBUG oslo_concurrency.lockutils [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.189 186853 DEBUG oslo_concurrency.lockutils [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.189 186853 DEBUG nova.compute.manager [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.189 186853 DEBUG nova.compute.manager [req-ab80677a-cf85-4890-b885-d420d023b898 req-f6870bf6-adb9-413e-a840-4d338154e1f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.638 186853 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Activated binding for port 9887acef-e389-49e2-87d8-70796da43759 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.639 186853 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.639 186853 DEBUG nova.virt.libvirt.vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:22:15Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.640 186853 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.640 186853 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.641 186853 DEBUG os_vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.643 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.643 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9887acef-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.646 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.649 186853 INFO os_vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3')#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.649 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.650 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.650 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.650 186853 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.651 186853 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Deleting instance files /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323_del#033[00m
Nov 22 03:22:31 np0005531887 nova_compute[186849]: 2025-11-22 08:22:31.652 186853 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Deletion of /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323_del complete#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.271 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.353 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.354 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.354 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.354 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.354 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.355 186853 WARNING nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.355 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.355 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.355 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.356 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.356 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.356 186853 WARNING nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.356 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.357 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.357 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.357 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.357 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.357 186853 WARNING nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 DEBUG oslo_concurrency.lockutils [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 DEBUG nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.358 186853 WARNING nova.compute.manager [req-bd867047-7074-48e9-9b4a-6595d1c21d06 req-3efd1729-e545-4734-98db-2ca9b0950974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state active and task_state migrating.#033[00m
Nov 22 03:22:32 np0005531887 nova_compute[186849]: 2025-11-22 08:22:32.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:22:33 np0005531887 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 03:22:33 np0005531887 systemd[238859]: Activating special unit Exit the Session...
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped target Main User Target.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped target Basic System.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped target Paths.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped target Sockets.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped target Timers.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 03:22:33 np0005531887 systemd[238859]: Closed D-Bus User Message Bus Socket.
Nov 22 03:22:33 np0005531887 systemd[238859]: Stopped Create User's Volatile Files and Directories.
Nov 22 03:22:33 np0005531887 systemd[238859]: Removed slice User Application Slice.
Nov 22 03:22:33 np0005531887 systemd[238859]: Reached target Shutdown.
Nov 22 03:22:33 np0005531887 systemd[238859]: Finished Exit the Session.
Nov 22 03:22:33 np0005531887 systemd[238859]: Reached target Exit the Session.
Nov 22 03:22:33 np0005531887 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 03:22:33 np0005531887 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 03:22:33 np0005531887 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 03:22:33 np0005531887 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 03:22:33 np0005531887 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 03:22:33 np0005531887 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 03:22:33 np0005531887 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 03:22:35 np0005531887 podman[239041]: 2025-11-22 08:22:35.846394762 +0000 UTC m=+0.061875902 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:22:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:36.148 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:22:36 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:36.149 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.150 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.646 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:22:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.932 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.933 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.933 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.952 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.953 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.953 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:36 np0005531887 nova_compute[186849]: 2025-11-22 08:22:36.953 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.137 186853 WARNING nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.138 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=73.27374649047852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.138 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.139 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.179 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Migration for instance 1ef73533-7ed2-422b-a432-c1f12dbc7323 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.202 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.273 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:37.356 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:22:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:37.356 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:22:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:37.357 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.542 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Migration a35794db-cb93-4d6b-9acb-ff35fa95c4f0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.543 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.543 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.708 186853 DEBUG nova.compute.provider_tree [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.721 186853 DEBUG nova.scheduler.client.report [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.841 186853 DEBUG nova.compute.resource_tracker [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.842 186853 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:22:37 np0005531887 nova_compute[186849]: 2025-11-22 08:22:37.855 186853 INFO nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Nov 22 03:22:38 np0005531887 nova_compute[186849]: 2025-11-22 08:22:38.116 186853 INFO nova.scheduler.client.report [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Deleted allocation for migration a35794db-cb93-4d6b-9acb-ff35fa95c4f0#033[00m
Nov 22 03:22:38 np0005531887 nova_compute[186849]: 2025-11-22 08:22:38.117 186853 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 22 03:22:40 np0005531887 nova_compute[186849]: 2025-11-22 08:22:40.652 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:40 np0005531887 nova_compute[186849]: 2025-11-22 08:22:40.760 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:40 np0005531887 podman[239064]: 2025-11-22 08:22:40.854713832 +0000 UTC m=+0.063566483 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:22:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:22:41.152 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:22:41 np0005531887 nova_compute[186849]: 2025-11-22 08:22:41.649 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:42 np0005531887 nova_compute[186849]: 2025-11-22 08:22:42.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:44 np0005531887 nova_compute[186849]: 2025-11-22 08:22:44.703 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799749.7020543, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:22:44 np0005531887 nova_compute[186849]: 2025-11-22 08:22:44.703 186853 INFO nova.compute.manager [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:22:44 np0005531887 nova_compute[186849]: 2025-11-22 08:22:44.725 186853 DEBUG nova.compute.manager [None req-2aace41b-9317-49cd-8de1-392919dad3df - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:22:44 np0005531887 podman[239084]: 2025-11-22 08:22:44.862587633 +0000 UTC m=+0.075262973 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:22:46 np0005531887 nova_compute[186849]: 2025-11-22 08:22:46.651 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:47 np0005531887 nova_compute[186849]: 2025-11-22 08:22:47.276 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:50 np0005531887 podman[239108]: 2025-11-22 08:22:50.832693958 +0000 UTC m=+0.056932519 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:22:51 np0005531887 nova_compute[186849]: 2025-11-22 08:22:51.653 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:52 np0005531887 nova_compute[186849]: 2025-11-22 08:22:52.279 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:55 np0005531887 podman[239129]: 2025-11-22 08:22:55.866369877 +0000 UTC m=+0.089507854 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:22:55 np0005531887 podman[239130]: 2025-11-22 08:22:55.874792105 +0000 UTC m=+0.094016326 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:22:56 np0005531887 nova_compute[186849]: 2025-11-22 08:22:56.657 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:57 np0005531887 nova_compute[186849]: 2025-11-22 08:22:57.281 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:22:59 np0005531887 podman[239177]: 2025-11-22 08:22:59.831486711 +0000 UTC m=+0.054298934 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:23:01 np0005531887 nova_compute[186849]: 2025-11-22 08:23:01.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:02 np0005531887 nova_compute[186849]: 2025-11-22 08:23:02.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:05 np0005531887 nova_compute[186849]: 2025-11-22 08:23:05.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:06 np0005531887 nova_compute[186849]: 2025-11-22 08:23:06.662 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:06 np0005531887 podman[239203]: 2025-11-22 08:23:06.8369599 +0000 UTC m=+0.058292863 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:23:07 np0005531887 nova_compute[186849]: 2025-11-22 08:23:07.287 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:10 np0005531887 nova_compute[186849]: 2025-11-22 08:23:10.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:10 np0005531887 nova_compute[186849]: 2025-11-22 08:23:10.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:10 np0005531887 nova_compute[186849]: 2025-11-22 08:23:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:23:10 np0005531887 nova_compute[186849]: 2025-11-22 08:23:10.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:23:10 np0005531887 nova_compute[186849]: 2025-11-22 08:23:10.779 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:23:11 np0005531887 nova_compute[186849]: 2025-11-22 08:23:11.665 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:11 np0005531887 podman[239223]: 2025-11-22 08:23:11.858010524 +0000 UTC m=+0.068830283 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 22 03:23:12 np0005531887 nova_compute[186849]: 2025-11-22 08:23:12.290 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:13 np0005531887 nova_compute[186849]: 2025-11-22 08:23:13.797 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.003 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.004 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5737MB free_disk=73.27376556396484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.004 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.005 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.088 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.089 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.125 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.146 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.148 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:23:14 np0005531887 nova_compute[186849]: 2025-11-22 08:23:14.148 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:15 np0005531887 podman[239243]: 2025-11-22 08:23:15.866217851 +0000 UTC m=+0.083254199 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:23:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:16.247 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2 2001:db8::f816:3eff:fe10:f1bb'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe10:f1bb/64', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e1bc69f6-ec55-4040-be0d-44f334cbe3a6) old=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:23:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:16.248 104084 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e1bc69f6-ec55-4040-be0d-44f334cbe3a6 in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd updated#033[00m
Nov 22 03:23:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:16.249 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 326c0814-77d4-416b-a5a1-28be00b61ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:23:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:16.249 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e9864b-a0d5-4eeb-b4b1-e38776011528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:23:16 np0005531887 nova_compute[186849]: 2025-11-22 08:23:16.667 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:17 np0005531887 nova_compute[186849]: 2025-11-22 08:23:17.293 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:19 np0005531887 nova_compute[186849]: 2025-11-22 08:23:19.148 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:21 np0005531887 nova_compute[186849]: 2025-11-22 08:23:21.668 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:21 np0005531887 nova_compute[186849]: 2025-11-22 08:23:21.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:21 np0005531887 podman[239269]: 2025-11-22 08:23:21.843419022 +0000 UTC m=+0.060852565 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, 
io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Nov 22 03:23:22 np0005531887 nova_compute[186849]: 2025-11-22 08:23:22.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:24 np0005531887 nova_compute[186849]: 2025-11-22 08:23:24.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:23:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:26.123 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:23:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:26.124 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:23:26 np0005531887 nova_compute[186849]: 2025-11-22 08:23:26.124 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:26 np0005531887 nova_compute[186849]: 2025-11-22 08:23:26.670 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:26 np0005531887 podman[239293]: 2025-11-22 08:23:26.84794275 +0000 UTC m=+0.063493932 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:23:26 np0005531887 podman[239294]: 2025-11-22 08:23:26.888435762 +0000 UTC m=+0.099711548 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:23:27 np0005531887 nova_compute[186849]: 2025-11-22 08:23:27.298 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:28.126 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:23:30 np0005531887 podman[239339]: 2025-11-22 08:23:30.832834543 +0000 UTC m=+0.055671958 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:23:31 np0005531887 nova_compute[186849]: 2025-11-22 08:23:31.673 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:32 np0005531887 nova_compute[186849]: 2025-11-22 08:23:32.300 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:36 np0005531887 nova_compute[186849]: 2025-11-22 08:23:36.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:37 np0005531887 ovn_controller[95130]: 2025-11-22T08:23:37Z|00539|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 03:23:37 np0005531887 nova_compute[186849]: 2025-11-22 08:23:37.302 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:37.357 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:23:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:37.357 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:23:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:23:37.358 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:23:37 np0005531887 podman[239364]: 2025-11-22 08:23:37.837886602 +0000 UTC m=+0.052770207 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 03:23:41 np0005531887 nova_compute[186849]: 2025-11-22 08:23:41.683 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:42 np0005531887 nova_compute[186849]: 2025-11-22 08:23:42.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:42 np0005531887 podman[239384]: 2025-11-22 08:23:42.841462984 +0000 UTC m=+0.059931032 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd)
Nov 22 03:23:46 np0005531887 nova_compute[186849]: 2025-11-22 08:23:46.685 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:46 np0005531887 podman[239404]: 2025-11-22 08:23:46.839863802 +0000 UTC m=+0.056094119 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:23:47 np0005531887 nova_compute[186849]: 2025-11-22 08:23:47.933 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:51 np0005531887 nova_compute[186849]: 2025-11-22 08:23:51.687 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:52 np0005531887 podman[239431]: 2025-11-22 08:23:52.847230738 +0000 UTC m=+0.068282169 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:23:52 np0005531887 nova_compute[186849]: 2025-11-22 08:23:52.935 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:56 np0005531887 nova_compute[186849]: 2025-11-22 08:23:56.689 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:23:57 np0005531887 podman[239452]: 2025-11-22 08:23:57.839136134 +0000 UTC m=+0.060856045 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:23:57 np0005531887 podman[239453]: 2025-11-22 08:23:57.907331571 +0000 UTC m=+0.122177293 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:23:57 np0005531887 nova_compute[186849]: 2025-11-22 08:23:57.936 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:01 np0005531887 nova_compute[186849]: 2025-11-22 08:24:01.692 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:01 np0005531887 podman[239498]: 2025-11-22 08:24:01.838386941 +0000 UTC m=+0.058009335 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:24:02 np0005531887 nova_compute[186849]: 2025-11-22 08:24:02.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:06 np0005531887 nova_compute[186849]: 2025-11-22 08:24:06.694 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:07.604 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:07.605 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:24:07 np0005531887 nova_compute[186849]: 2025-11-22 08:24:07.605 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:07 np0005531887 nova_compute[186849]: 2025-11-22 08:24:07.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:07 np0005531887 nova_compute[186849]: 2025-11-22 08:24:07.939 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:08 np0005531887 podman[239520]: 2025-11-22 08:24:08.862321027 +0000 UTC m=+0.085645479 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:24:10 np0005531887 nova_compute[186849]: 2025-11-22 08:24:10.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:10 np0005531887 nova_compute[186849]: 2025-11-22 08:24:10.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:24:10 np0005531887 nova_compute[186849]: 2025-11-22 08:24:10.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:24:10 np0005531887 nova_compute[186849]: 2025-11-22 08:24:10.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:24:11 np0005531887 nova_compute[186849]: 2025-11-22 08:24:11.695 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:11 np0005531887 nova_compute[186849]: 2025-11-22 08:24:11.776 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:12 np0005531887 nova_compute[186849]: 2025-11-22 08:24:12.940 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:13 np0005531887 nova_compute[186849]: 2025-11-22 08:24:13.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:13 np0005531887 podman[239540]: 2025-11-22 08:24:13.855424361 +0000 UTC m=+0.073365654 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:24:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:14.607 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.809 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:14 np0005531887 nova_compute[186849]: 2025-11-22 08:24:14.810 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.053 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.055 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.27420043945312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.055 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.055 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.175 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.176 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.210 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.227 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.229 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:24:15 np0005531887 nova_compute[186849]: 2025-11-22 08:24:15.229 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:16 np0005531887 nova_compute[186849]: 2025-11-22 08:24:16.698 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:17 np0005531887 podman[239559]: 2025-11-22 08:24:17.828805199 +0000 UTC m=+0.051123815 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:24:17 np0005531887 nova_compute[186849]: 2025-11-22 08:24:17.942 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:19 np0005531887 nova_compute[186849]: 2025-11-22 08:24:19.230 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:21 np0005531887 nova_compute[186849]: 2025-11-22 08:24:21.701 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:21 np0005531887 nova_compute[186849]: 2025-11-22 08:24:21.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:22 np0005531887 nova_compute[186849]: 2025-11-22 08:24:22.943 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:23 np0005531887 podman[239584]: 2025-11-22 08:24:23.838336619 +0000 UTC m=+0.062359553 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:24:24 np0005531887 nova_compute[186849]: 2025-11-22 08:24:24.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:24 np0005531887 nova_compute[186849]: 2025-11-22 08:24:24.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:24:24 np0005531887 nova_compute[186849]: 2025-11-22 08:24:24.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:24:26 np0005531887 nova_compute[186849]: 2025-11-22 08:24:26.704 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:26 np0005531887 nova_compute[186849]: 2025-11-22 08:24:26.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:27 np0005531887 nova_compute[186849]: 2025-11-22 08:24:27.944 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:28 np0005531887 podman[239606]: 2025-11-22 08:24:28.833281189 +0000 UTC m=+0.056757984 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:24:28 np0005531887 podman[239607]: 2025-11-22 08:24:28.854449213 +0000 UTC m=+0.074150264 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:24:29 np0005531887 nova_compute[186849]: 2025-11-22 08:24:29.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:29 np0005531887 nova_compute[186849]: 2025-11-22 08:24:29.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:24:31 np0005531887 nova_compute[186849]: 2025-11-22 08:24:31.707 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:32 np0005531887 podman[239649]: 2025-11-22 08:24:32.832367552 +0000 UTC m=+0.056646852 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:24:32 np0005531887 nova_compute[186849]: 2025-11-22 08:24:32.946 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:36 np0005531887 nova_compute[186849]: 2025-11-22 08:24:36.143 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:36 np0005531887 nova_compute[186849]: 2025-11-22 08:24:36.144 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:36 np0005531887 nova_compute[186849]: 2025-11-22 08:24:36.249 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:24:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:24:36 np0005531887 nova_compute[186849]: 2025-11-22 08:24:36.710 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.206 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.207 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.215 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.216 186853 INFO nova.compute.claims [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:24:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:37.357 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:37.358 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:37.358 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.445 186853 DEBUG nova.compute.provider_tree [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.466 186853 DEBUG nova.scheduler.client.report [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.506 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.507 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.632 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.633 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.704 186853 INFO nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.738 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:37 np0005531887 nova_compute[186849]: 2025-11-22 08:24:37.948 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.117 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.118 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.119 186853 INFO nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Creating image(s)#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.119 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.120 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.121 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.139 186853 DEBUG nova.policy [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.142 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.206 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.207 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.208 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.218 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.284 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:38 np0005531887 nova_compute[186849]: 2025-11-22 08:24:38.285 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.334 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk 1073741824" returned: 0 in 1.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.335 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.336 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.400 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.402 186853 DEBUG nova.virt.disk.api [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.402 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.462 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.464 186853 DEBUG nova.virt.disk.api [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.464 186853 DEBUG nova.objects.instance [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 0f4580b9-9b95-420b-a41e-971fafe8dab0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.478 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.478 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Ensure instance console log exists: /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.479 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.479 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.479 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:39 np0005531887 nova_compute[186849]: 2025-11-22 08:24:39.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:24:39 np0005531887 podman[239688]: 2025-11-22 08:24:39.842523876 +0000 UTC m=+0.056502028 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:24:40 np0005531887 nova_compute[186849]: 2025-11-22 08:24:40.915 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Successfully created port: a6fcb4cd-f25f-467f-926b-423518d175bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:24:41 np0005531887 nova_compute[186849]: 2025-11-22 08:24:41.714 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:42 np0005531887 nova_compute[186849]: 2025-11-22 08:24:42.949 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:44 np0005531887 podman[239710]: 2025-11-22 08:24:44.848509169 +0000 UTC m=+0.065865970 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.505 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Successfully updated port: a6fcb4cd-f25f-467f-926b-423518d175bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.624 186853 DEBUG nova.compute.manager [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.625 186853 DEBUG nova.compute.manager [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing instance network info cache due to event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.625 186853 DEBUG oslo_concurrency.lockutils [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.625 186853 DEBUG oslo_concurrency.lockutils [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.625 186853 DEBUG nova.network.neutron [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing network info cache for port a6fcb4cd-f25f-467f-926b-423518d175bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:24:45 np0005531887 nova_compute[186849]: 2025-11-22 08:24:45.781 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.124 186853 DEBUG nova.network.neutron [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.562 186853 DEBUG nova.network.neutron [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.582 186853 DEBUG oslo_concurrency.lockutils [req-32166c2d-a58c-4ebf-9ae8-b15c3a1d30a1 req-8f0250c8-43f6-4123-b097-8d5aa618bcdd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.584 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.584 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:24:46 np0005531887 nova_compute[186849]: 2025-11-22 08:24:46.717 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:47 np0005531887 nova_compute[186849]: 2025-11-22 08:24:47.246 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:24:47 np0005531887 nova_compute[186849]: 2025-11-22 08:24:47.951 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:48 np0005531887 podman[239730]: 2025-11-22 08:24:48.834230162 +0000 UTC m=+0.055379970 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.258 186853 DEBUG nova.network.neutron [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.323 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.323 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance network_info: |[{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.325 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Start _get_guest_xml network_info=[{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.333 186853 WARNING nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.345 186853 DEBUG nova.virt.libvirt.host [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.346 186853 DEBUG nova.virt.libvirt.host [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.355 186853 DEBUG nova.virt.libvirt.host [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.356 186853 DEBUG nova.virt.libvirt.host [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.357 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.358 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.358 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.358 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.359 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.359 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.360 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.360 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.361 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.361 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.361 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.361 186853 DEBUG nova.virt.hardware [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.366 186853 DEBUG nova.virt.libvirt.vif [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:24:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1702931055',display_name='tempest-TestNetworkAdvancedServerOps-server-1702931055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1702931055',id=160,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxaQpYKOKe4L0KR7bY79WQEoXUDaFhQINyKgLQxxD+DY1SMW41QKPoSnfT27Llv7MI1/G06FoayeK4tBR1oc1AWt511XOIyHrR4CcBO56ZLEeXBHthfNrR7xWlOpV113A==',key_name='tempest-TestNetworkAdvancedServerOps-1146531194',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-h8w9hc3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:24:37Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=0f4580b9-9b95-420b-a41e-971fafe8dab0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.366 186853 DEBUG nova.network.os_vif_util [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.367 186853 DEBUG nova.network.os_vif_util [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.368 186853 DEBUG nova.objects.instance [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f4580b9-9b95-420b-a41e-971fafe8dab0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.390 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <uuid>0f4580b9-9b95-420b-a41e-971fafe8dab0</uuid>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <name>instance-000000a0</name>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1702931055</nova:name>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:24:49</nova:creationTime>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        <nova:port uuid="a6fcb4cd-f25f-467f-926b-423518d175bb">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="serial">0f4580b9-9b95-420b-a41e-971fafe8dab0</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="uuid">0f4580b9-9b95-420b-a41e-971fafe8dab0</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.config"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:04:50:b7"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <target dev="tapa6fcb4cd-f2"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/console.log" append="off"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:24:49 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:24:49 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:24:49 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:24:49 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.391 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Preparing to wait for external event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.391 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.391 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.392 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.392 186853 DEBUG nova.virt.libvirt.vif [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:24:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1702931055',display_name='tempest-TestNetworkAdvancedServerOps-server-1702931055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1702931055',id=160,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxaQpYKOKe4L0KR7bY79WQEoXUDaFhQINyKgLQxxD+DY1SMW41QKPoSnfT27Llv7MI1/G06FoayeK4tBR1oc1AWt511XOIyHrR4CcBO56ZLEeXBHthfNrR7xWlOpV113A==',key_name='tempest-TestNetworkAdvancedServerOps-1146531194',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-h8w9hc3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:24:37Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=0f4580b9-9b95-420b-a41e-971fafe8dab0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.393 186853 DEBUG nova.network.os_vif_util [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.393 186853 DEBUG nova.network.os_vif_util [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.394 186853 DEBUG os_vif [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.394 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.395 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.395 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.400 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6fcb4cd-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.400 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa6fcb4cd-f2, col_values=(('external_ids', {'iface-id': 'a6fcb4cd-f25f-467f-926b-423518d175bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:50:b7', 'vm-uuid': '0f4580b9-9b95-420b-a41e-971fafe8dab0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:49 np0005531887 NetworkManager[55210]: <info>  [1763799889.4039] manager: (tapa6fcb4cd-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.415 186853 INFO os_vif [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2')#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.478 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.479 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.479 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:04:50:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:24:49 np0005531887 nova_compute[186849]: 2025-11-22 08:24:49.479 186853 INFO nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Using config drive#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.232 186853 INFO nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Creating config drive at /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.config#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.237 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9lv9ym13 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.363 186853 DEBUG oslo_concurrency.processutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9lv9ym13" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:24:50 np0005531887 kernel: tapa6fcb4cd-f2: entered promiscuous mode
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.4444] manager: (tapa6fcb4cd-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.445 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:50Z|00540|binding|INFO|Claiming lport a6fcb4cd-f25f-467f-926b-423518d175bb for this chassis.
Nov 22 03:24:50 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:50Z|00541|binding|INFO|a6fcb4cd-f25f-467f-926b-423518d175bb: Claiming fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 systemd-udevd[239775]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:24:50 np0005531887 systemd-machined[153180]: New machine qemu-58-instance-000000a0.
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.4910] device (tapa6fcb4cd-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.4921] device (tapa6fcb4cd-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.505 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.506 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd bound to our chassis#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.507 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 87c3d412-41cf-4963-ada8-de4b3881e6fd#033[00m
Nov 22 03:24:50 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:50Z|00542|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb ovn-installed in OVS
Nov 22 03:24:50 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:50Z|00543|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb up in Southbound
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 systemd[1]: Started Virtual Machine qemu-58-instance-000000a0.
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.519 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d56e1a92-f63f-42c2-bf8b-d2eebc6d39a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.520 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap87c3d412-41 in ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.523 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap87c3d412-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.523 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[47dca717-d475-4c34-b489-690cb1b3cf8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.524 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9dd8a3-a3c7-4fbe-9c86-11a4da2128cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.536 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[a8febe74-73c4-4829-a167-684721f873ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.562 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8e68cf-9dab-4d54-9a1f-38af8867d15a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.597 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8902f3-dea9-4ee0-ba52-16378c7c557c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.606 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[5a28e30e-0d5c-406b-aa44-2fa3d9b9ab0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.6080] manager: (tap87c3d412-40): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.643 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[14d0e12a-0764-41ec-923e-639a9a90fa72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.647 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7882ca6a-6609-46b8-9dc4-b92b3c260552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.6761] device (tap87c3d412-40): carrier: link connected
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.682 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e36a7d-a9b8-4848-80cd-36d872ece742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.705 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[82271896-90bd-4959-920d-81739a0eba51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87c3d412-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:05:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663328, 'reachable_time': 41317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239809, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.727 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e421095f-2e7c-4eea-91b8-1ae35b8c7c2e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:537'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 663328, 'tstamp': 663328}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239811, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.747 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ea6344-8617-4f69-83f6-abd2e3df94df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87c3d412-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:05:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663328, 'reachable_time': 41317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239816, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.790 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[17993e8f-a768-4c43-883d-20f036ffc075]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.846 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799890.8456664, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.846 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Started (Lifecycle Event)#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.856 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8188f8ed-04dd-43aa-a2da-b6e5d154f89f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.858 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87c3d412-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.859 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.859 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87c3d412-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.861 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 NetworkManager[55210]: <info>  [1763799890.8625] manager: (tap87c3d412-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Nov 22 03:24:50 np0005531887 kernel: tap87c3d412-40: entered promiscuous mode
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.863 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.866 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap87c3d412-40, col_values=(('external_ids', {'iface-id': 'c3dedc73-8573-44a6-afa7-82c65dae3823'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.868 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:50Z|00544|binding|INFO|Releasing lport c3dedc73-8573-44a6-afa7-82c65dae3823 from this chassis (sb_readonly=0)
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.870 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.871 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.871 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8fee45f3-a3f8-47f8-9a0c-6fec0e1a5ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.873 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-87c3d412-41cf-4963-ada8-de4b3881e6fd
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 87c3d412-41cf-4963-ada8-de4b3881e6fd
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:24:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:50.875 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'env', 'PROCESS_TAG=haproxy-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/87c3d412-41cf-4963-ada8-de4b3881e6fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.877 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799890.845801, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.877 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.883 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.898 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.904 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:24:50 np0005531887 nova_compute[186849]: 2025-11-22 08:24:50.926 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.350 186853 DEBUG nova.compute.manager [req-94684951-9c10-4a57-be6d-8ff54ea93a4c req-4860ba3e-5f3c-41b1-956e-ffb83f26673f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.350 186853 DEBUG oslo_concurrency.lockutils [req-94684951-9c10-4a57-be6d-8ff54ea93a4c req-4860ba3e-5f3c-41b1-956e-ffb83f26673f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.351 186853 DEBUG oslo_concurrency.lockutils [req-94684951-9c10-4a57-be6d-8ff54ea93a4c req-4860ba3e-5f3c-41b1-956e-ffb83f26673f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.351 186853 DEBUG oslo_concurrency.lockutils [req-94684951-9c10-4a57-be6d-8ff54ea93a4c req-4860ba3e-5f3c-41b1-956e-ffb83f26673f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.352 186853 DEBUG nova.compute.manager [req-94684951-9c10-4a57-be6d-8ff54ea93a4c req-4860ba3e-5f3c-41b1-956e-ffb83f26673f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Processing event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.352 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.356 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799891.3560128, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.356 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.358 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.362 186853 INFO nova.virt.libvirt.driver [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance spawned successfully.#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.363 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:24:51 np0005531887 podman[239849]: 2025-11-22 08:24:51.291475207 +0000 UTC m=+0.031013988 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.398 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.405 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.410 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.411 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.411 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.411 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.412 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.412 186853 DEBUG nova.virt.libvirt.driver [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.438 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.506 186853 INFO nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Took 13.39 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.506 186853 DEBUG nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:24:51 np0005531887 podman[239849]: 2025-11-22 08:24:51.523564166 +0000 UTC m=+0.263102917 container create 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.644 186853 INFO nova.compute.manager [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Took 14.56 seconds to build instance.#033[00m
Nov 22 03:24:51 np0005531887 systemd[1]: Started libpod-conmon-634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036.scope.
Nov 22 03:24:51 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:24:51 np0005531887 nova_compute[186849]: 2025-11-22 08:24:51.700 186853 DEBUG oslo_concurrency.lockutils [None req-d8c52090-5ad7-44f5-9e86-402950144df2 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:51 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16fa7737556e2ecdb49351547e64a742bcebfd2d0c79971211bb5253e6ecac80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:24:51 np0005531887 podman[239849]: 2025-11-22 08:24:51.882749769 +0000 UTC m=+0.622288540 container init 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:24:51 np0005531887 podman[239849]: 2025-11-22 08:24:51.889581368 +0000 UTC m=+0.629120119 container start 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 03:24:51 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [NOTICE]   (239868) : New worker (239870) forked
Nov 22 03:24:51 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [NOTICE]   (239868) : Loading success.
Nov 22 03:24:52 np0005531887 nova_compute[186849]: 2025-11-22 08:24:52.954 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.460 186853 DEBUG nova.compute.manager [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.461 186853 DEBUG oslo_concurrency.lockutils [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.461 186853 DEBUG oslo_concurrency.lockutils [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.461 186853 DEBUG oslo_concurrency.lockutils [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.461 186853 DEBUG nova.compute.manager [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:24:53 np0005531887 nova_compute[186849]: 2025-11-22 08:24:53.461 186853 WARNING nova.compute.manager [req-b342c9cc-8ef1-4a35-8b2f-e47d26e83d0b req-28a313f2-ee41-4fa3-b29e-ce7e4257d5d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state active and task_state None.#033[00m
Nov 22 03:24:54 np0005531887 nova_compute[186849]: 2025-11-22 08:24:54.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:54 np0005531887 podman[239879]: 2025-11-22 08:24:54.856554079 +0000 UTC m=+0.069040999 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:24:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:56.856 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:24:56 np0005531887 nova_compute[186849]: 2025-11-22 08:24:56.862 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:56 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:24:56.863 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:24:57 np0005531887 nova_compute[186849]: 2025-11-22 08:24:57.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:58 np0005531887 nova_compute[186849]: 2025-11-22 08:24:58.663 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:58 np0005531887 NetworkManager[55210]: <info>  [1763799898.6826] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Nov 22 03:24:58 np0005531887 NetworkManager[55210]: <info>  [1763799898.6842] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Nov 22 03:24:58 np0005531887 nova_compute[186849]: 2025-11-22 08:24:58.738 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:58 np0005531887 ovn_controller[95130]: 2025-11-22T08:24:58Z|00545|binding|INFO|Releasing lport c3dedc73-8573-44a6-afa7-82c65dae3823 from this chassis (sb_readonly=0)
Nov 22 03:24:58 np0005531887 nova_compute[186849]: 2025-11-22 08:24:58.749 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.408 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:24:59 np0005531887 podman[239901]: 2025-11-22 08:24:59.852288648 +0000 UTC m=+0.071050788 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:24:59 np0005531887 podman[239902]: 2025-11-22 08:24:59.879417969 +0000 UTC m=+0.093992955 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.973 186853 DEBUG nova.compute.manager [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.974 186853 DEBUG nova.compute.manager [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing instance network info cache due to event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.974 186853 DEBUG oslo_concurrency.lockutils [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.974 186853 DEBUG oslo_concurrency.lockutils [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:24:59 np0005531887 nova_compute[186849]: 2025-11-22 08:24:59.975 186853 DEBUG nova.network.neutron [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing network info cache for port a6fcb4cd-f25f-467f-926b-423518d175bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:25:02 np0005531887 nova_compute[186849]: 2025-11-22 08:25:02.652 186853 DEBUG nova.network.neutron [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updated VIF entry in instance network info cache for port a6fcb4cd-f25f-467f-926b-423518d175bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:25:02 np0005531887 nova_compute[186849]: 2025-11-22 08:25:02.653 186853 DEBUG nova.network.neutron [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:02 np0005531887 nova_compute[186849]: 2025-11-22 08:25:02.732 186853 DEBUG oslo_concurrency.lockutils [req-ac9e648b-100a-481e-baeb-7f0d687bc31c req-24b8ea9a-74ba-4f0f-a58f-a547101f7049 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:25:02 np0005531887 nova_compute[186849]: 2025-11-22 08:25:02.958 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:03 np0005531887 nova_compute[186849]: 2025-11-22 08:25:03.673 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:03 np0005531887 nova_compute[186849]: 2025-11-22 08:25:03.696 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Triggering sync for uuid 0f4580b9-9b95-420b-a41e-971fafe8dab0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:25:03 np0005531887 nova_compute[186849]: 2025-11-22 08:25:03.697 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:03 np0005531887 nova_compute[186849]: 2025-11-22 08:25:03.697 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:03 np0005531887 nova_compute[186849]: 2025-11-22 08:25:03.720 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:03 np0005531887 podman[239949]: 2025-11-22 08:25:03.850336946 +0000 UTC m=+0.067058939 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:25:04 np0005531887 nova_compute[186849]: 2025-11-22 08:25:04.411 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:05 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:05.865 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:07 np0005531887 nova_compute[186849]: 2025-11-22 08:25:07.962 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:09 np0005531887 nova_compute[186849]: 2025-11-22 08:25:09.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:09Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:25:09 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:09Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:25:09 np0005531887 nova_compute[186849]: 2025-11-22 08:25:09.793 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:10 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:10Z|00546|binding|INFO|Releasing lport c3dedc73-8573-44a6-afa7-82c65dae3823 from this chassis (sb_readonly=0)
Nov 22 03:25:10 np0005531887 nova_compute[186849]: 2025-11-22 08:25:10.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:10 np0005531887 nova_compute[186849]: 2025-11-22 08:25:10.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:10 np0005531887 nova_compute[186849]: 2025-11-22 08:25:10.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:25:10 np0005531887 nova_compute[186849]: 2025-11-22 08:25:10.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:25:10 np0005531887 podman[239995]: 2025-11-22 08:25:10.854308657 +0000 UTC m=+0.068409002 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:25:11 np0005531887 nova_compute[186849]: 2025-11-22 08:25:11.260 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:25:11 np0005531887 nova_compute[186849]: 2025-11-22 08:25:11.260 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:25:11 np0005531887 nova_compute[186849]: 2025-11-22 08:25:11.260 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:25:11 np0005531887 nova_compute[186849]: 2025-11-22 08:25:11.261 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0f4580b9-9b95-420b-a41e-971fafe8dab0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:25:12 np0005531887 nova_compute[186849]: 2025-11-22 08:25:12.964 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:14 np0005531887 nova_compute[186849]: 2025-11-22 08:25:14.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.295 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.307 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.308 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.308 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.527 186853 INFO nova.compute.manager [None req-e95d03e9-746d-436e-9faf-ad8a26665527 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Get console output#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.534 213402 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.795 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:25:15 np0005531887 podman[240016]: 2025-11-22 08:25:15.871981388 +0000 UTC m=+0.081522597 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:25:15 np0005531887 nova_compute[186849]: 2025-11-22 08:25:15.890 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.121 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.122 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.179 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.180 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.180 186853 INFO nova.compute.manager [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Rebooting instance#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.183 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.194 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.195 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.195 186853 DEBUG nova.network.neutron [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.375 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.376 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5542MB free_disk=73.24549102783203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.376 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.376 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.479 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 0f4580b9-9b95-420b-a41e-971fafe8dab0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.480 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.480 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.722 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.734 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.872 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:25:16 np0005531887 nova_compute[186849]: 2025-11-22 08:25:16.872 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:17 np0005531887 nova_compute[186849]: 2025-11-22 08:25:17.668 186853 DEBUG nova.network.neutron [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:17 np0005531887 nova_compute[186849]: 2025-11-22 08:25:17.691 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:25:17 np0005531887 nova_compute[186849]: 2025-11-22 08:25:17.698 186853 DEBUG nova.compute.manager [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:25:17 np0005531887 nova_compute[186849]: 2025-11-22 08:25:17.965 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:19 np0005531887 nova_compute[186849]: 2025-11-22 08:25:19.421 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:19 np0005531887 podman[240042]: 2025-11-22 08:25:19.828332255 +0000 UTC m=+0.052365876 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:25:20 np0005531887 kernel: tapa6fcb4cd-f2 (unregistering): left promiscuous mode
Nov 22 03:25:20 np0005531887 NetworkManager[55210]: <info>  [1763799920.8513] device (tapa6fcb4cd-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:25:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:20Z|00547|binding|INFO|Releasing lport a6fcb4cd-f25f-467f-926b-423518d175bb from this chassis (sb_readonly=0)
Nov 22 03:25:20 np0005531887 nova_compute[186849]: 2025-11-22 08:25:20.863 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:20Z|00548|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb down in Southbound
Nov 22 03:25:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:20Z|00549|binding|INFO|Removing iface tapa6fcb4cd-f2 ovn-installed in OVS
Nov 22 03:25:20 np0005531887 nova_compute[186849]: 2025-11-22 08:25:20.868 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:20 np0005531887 nova_compute[186849]: 2025-11-22 08:25:20.871 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:20 np0005531887 nova_compute[186849]: 2025-11-22 08:25:20.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:20 np0005531887 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Nov 22 03:25:20 np0005531887 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d000000a0.scope: Consumed 16.930s CPU time.
Nov 22 03:25:20 np0005531887 systemd-machined[153180]: Machine qemu-58-instance-000000a0 terminated.
Nov 22 03:25:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:20.994 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:20.995 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd unbound from our chassis#033[00m
Nov 22 03:25:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:20.996 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87c3d412-41cf-4963-ada8-de4b3881e6fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:25:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:20.997 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3f641116-43cb-465b-aae8-bae29e310f08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:20.998 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd namespace which is not needed anymore#033[00m
Nov 22 03:25:21 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [NOTICE]   (239868) : haproxy version is 2.8.14-c23fe91
Nov 22 03:25:21 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [NOTICE]   (239868) : path to executable is /usr/sbin/haproxy
Nov 22 03:25:21 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [WARNING]  (239868) : Exiting Master process...
Nov 22 03:25:21 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [ALERT]    (239868) : Current worker (239870) exited with code 143 (Terminated)
Nov 22 03:25:21 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[239864]: [WARNING]  (239868) : All workers exited. Exiting... (0)
Nov 22 03:25:21 np0005531887 systemd[1]: libpod-634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036.scope: Deactivated successfully.
Nov 22 03:25:21 np0005531887 podman[240094]: 2025-11-22 08:25:21.491799922 +0000 UTC m=+0.392906858 container died 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.813 186853 DEBUG nova.compute.manager [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.813 186853 DEBUG oslo_concurrency.lockutils [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.814 186853 DEBUG oslo_concurrency.lockutils [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.814 186853 DEBUG oslo_concurrency.lockutils [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.814 186853 DEBUG nova.compute.manager [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:21 np0005531887 nova_compute[186849]: 2025-11-22 08:25:21.815 186853 WARNING nova.compute.manager [req-778ee07e-330c-4dc8-823f-589b9b7170d5 req-1707843d-cbd9-43b7-aa06-84064d5913d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state active and task_state reboot_started.#033[00m
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.134 186853 INFO nova.virt.libvirt.driver [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance shutdown successfully.#033[00m
Nov 22 03:25:22 np0005531887 kernel: tapa6fcb4cd-f2: entered promiscuous mode
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.209 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:22 np0005531887 NetworkManager[55210]: <info>  [1763799922.2107] manager: (tapa6fcb4cd-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Nov 22 03:25:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:22Z|00550|binding|INFO|Claiming lport a6fcb4cd-f25f-467f-926b-423518d175bb for this chassis.
Nov 22 03:25:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:22Z|00551|binding|INFO|a6fcb4cd-f25f-467f-926b-423518d175bb: Claiming fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:25:22 np0005531887 systemd-udevd[240069]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:25:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:22Z|00552|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb ovn-installed in OVS
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:22 np0005531887 NetworkManager[55210]: <info>  [1763799922.2304] device (tapa6fcb4cd-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.230 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:22 np0005531887 NetworkManager[55210]: <info>  [1763799922.2315] device (tapa6fcb4cd-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:22 np0005531887 systemd-machined[153180]: New machine qemu-59-instance-000000a0.
Nov 22 03:25:22 np0005531887 systemd[1]: Started Virtual Machine qemu-59-instance-000000a0.
Nov 22 03:25:22 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036-userdata-shm.mount: Deactivated successfully.
Nov 22 03:25:22 np0005531887 systemd[1]: var-lib-containers-storage-overlay-16fa7737556e2ecdb49351547e64a742bcebfd2d0c79971211bb5253e6ecac80-merged.mount: Deactivated successfully.
Nov 22 03:25:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:22.407 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '5', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:22 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:22Z|00553|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb up in Southbound
Nov 22 03:25:22 np0005531887 nova_compute[186849]: 2025-11-22 08:25:22.968 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.202 186853 DEBUG nova.virt.libvirt.host [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Removed pending event for 0f4580b9-9b95-420b-a41e-971fafe8dab0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.203 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799923.20241, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.204 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.209 186853 INFO nova.virt.libvirt.driver [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance running successfully.#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.209 186853 INFO nova.virt.libvirt.driver [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance soft rebooted successfully.#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.210 186853 DEBUG nova.compute.manager [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.234 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.238 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:25:23 np0005531887 podman[240094]: 2025-11-22 08:25:23.263358331 +0000 UTC m=+2.164465267 container cleanup 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.264 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.265 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763799923.20344, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.265 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Started (Lifecycle Event)#033[00m
Nov 22 03:25:23 np0005531887 systemd[1]: libpod-conmon-634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036.scope: Deactivated successfully.
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.280 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.284 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.305 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.707 186853 DEBUG oslo_concurrency.lockutils [None req-871aadb5-1da4-4af3-8207-6b4096220c0e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.890 186853 DEBUG nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.891 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.891 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.891 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.892 186853 DEBUG nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.892 186853 WARNING nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state active and task_state None.#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.892 186853 DEBUG nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.893 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.893 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.893 186853 DEBUG oslo_concurrency.lockutils [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.894 186853 DEBUG nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.894 186853 WARNING nova.compute.manager [req-2a3faf59-a3d5-4185-9076-619137348fd6 req-473d392c-752a-4c79-929a-9dd7435ba8db 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state active and task_state None.#033[00m
Nov 22 03:25:23 np0005531887 podman[240169]: 2025-11-22 08:25:23.900327833 +0000 UTC m=+0.610838197 container remove 634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.906 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff8f068-98c8-4a61-a189-8a656f555f5c]: (4, ('Sat Nov 22 08:25:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd (634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036)\n634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036\nSat Nov 22 08:25:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd (634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036)\n634ed7a34482971aaa50adad302c26317458faafbc500bced2cbea6554f56036\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.908 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[36ca9b78-df00-4377-901e-91ec940cd304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.909 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87c3d412-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.911 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:23 np0005531887 kernel: tap87c3d412-40: left promiscuous mode
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.917 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[942f5695-a959-489f-8e0c-5b4727a250ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 nova_compute[186849]: 2025-11-22 08:25:23.929 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.936 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[becbc748-af12-47e4-8dbf-e5d216d097d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.938 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0f1a2a-d683-43e5-a5b2-e131adf6ff50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.957 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a0413db6-2e68-432d-b73f-e6c030c42470]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 663319, 'reachable_time': 28360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240184, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 systemd[1]: run-netns-ovnmeta\x2d87c3d412\x2d41cf\x2d4963\x2dada8\x2dde4b3881e6fd.mount: Deactivated successfully.
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.961 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.961 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[14d36659-f6a8-4cc1-ae63-80d29ccd2fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.962 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd unbound from our chassis#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.964 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 87c3d412-41cf-4963-ada8-de4b3881e6fd#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.977 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[bbce6ce3-4260-4908-b601-6e139a3b6a68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.978 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap87c3d412-41 in ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.979 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap87c3d412-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.979 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d68ad1-eeb7-452c-bfec-6ca9de50e091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.980 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[73d8261f-ea20-4cec-ac2a-b3718926d862]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:23.992 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[2614dde2-8a19-4642-9cbc-ac4cb3d64918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.016 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f81a8c02-00fd-4991-9dfd-72d673a17c1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.045 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[741e2de7-6312-47fb-a31f-8aefc5943cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.053 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[88527383-ef30-4633-ba43-dbee4ceecba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 NetworkManager[55210]: <info>  [1763799924.0551] manager: (tap87c3d412-40): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Nov 22 03:25:24 np0005531887 systemd-udevd[240186]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.095 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[fb3d314c-b72f-4bb8-80fa-254a9c018bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.099 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8e55b21b-c329-49c3-a8f5-95935b1a954b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 NetworkManager[55210]: <info>  [1763799924.1279] device (tap87c3d412-40): carrier: link connected
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.134 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[77d207db-4d58-43a6-b195-9493b719a743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.148 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[7e057bf8-c199-49ff-95bd-4af595b73846]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87c3d412-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:05:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666673, 'reachable_time': 42422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240215, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.161 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[22adff29-9eef-495b-af87-4888611fef2a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:537'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 666673, 'tstamp': 666673}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240216, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.174 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8609f7-a7d9-4b85-813d-5d44b6335b79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87c3d412-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:05:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666673, 'reachable_time': 42422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240217, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.201 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb4baa1-3791-4668-a743-a7d5315d2669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.254 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[20795caf-b89f-48c5-a4ee-d3b03c845d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.256 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87c3d412-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.256 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.256 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87c3d412-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:24 np0005531887 nova_compute[186849]: 2025-11-22 08:25:24.258 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:24 np0005531887 kernel: tap87c3d412-40: entered promiscuous mode
Nov 22 03:25:24 np0005531887 NetworkManager[55210]: <info>  [1763799924.2599] manager: (tap87c3d412-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Nov 22 03:25:24 np0005531887 nova_compute[186849]: 2025-11-22 08:25:24.261 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.264 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap87c3d412-40, col_values=(('external_ids', {'iface-id': 'c3dedc73-8573-44a6-afa7-82c65dae3823'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:24 np0005531887 nova_compute[186849]: 2025-11-22 08:25:24.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:24 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:24Z|00554|binding|INFO|Releasing lport c3dedc73-8573-44a6-afa7-82c65dae3823 from this chassis (sb_readonly=0)
Nov 22 03:25:24 np0005531887 nova_compute[186849]: 2025-11-22 08:25:24.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.283 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.284 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d199a3f6-310b-44dd-ac43-79f0242b6c2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.285 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-87c3d412-41cf-4963-ada8-de4b3881e6fd
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/87c3d412-41cf-4963-ada8-de4b3881e6fd.pid.haproxy
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 87c3d412-41cf-4963-ada8-de4b3881e6fd
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:25:24 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:24.286 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'env', 'PROCESS_TAG=haproxy-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/87c3d412-41cf-4963-ada8-de4b3881e6fd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:25:24 np0005531887 nova_compute[186849]: 2025-11-22 08:25:24.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:24 np0005531887 podman[240249]: 2025-11-22 08:25:24.625351332 +0000 UTC m=+0.022179790 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:25:25 np0005531887 podman[240249]: 2025-11-22 08:25:25.080098897 +0000 UTC m=+0.476927305 container create 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:25:25 np0005531887 systemd[1]: Started libpod-conmon-90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26.scope.
Nov 22 03:25:25 np0005531887 podman[240262]: 2025-11-22 08:25:25.193248956 +0000 UTC m=+0.071243993 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Nov 22 03:25:25 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:25:25 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7f28e2be460a5719f6418139fa83f48328f7445bef6789ae86cedd7289812c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:25:25 np0005531887 podman[240249]: 2025-11-22 08:25:25.395819184 +0000 UTC m=+0.792647622 container init 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 03:25:25 np0005531887 podman[240249]: 2025-11-22 08:25:25.403035453 +0000 UTC m=+0.799863861 container start 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:25:25 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [NOTICE]   (240289) : New worker (240291) forked
Nov 22 03:25:25 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [NOTICE]   (240289) : Loading success.
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.397 186853 DEBUG nova.compute.manager [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.398 186853 DEBUG oslo_concurrency.lockutils [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.398 186853 DEBUG oslo_concurrency.lockutils [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.398 186853 DEBUG oslo_concurrency.lockutils [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.399 186853 DEBUG nova.compute.manager [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:26 np0005531887 nova_compute[186849]: 2025-11-22 08:25:26.399 186853 WARNING nova.compute.manager [req-c06134b1-3b9a-4f76-9d28-790d3b127de2 req-d79724c4-981f-4c4e-94f5-27316110c6ef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state active and task_state None.#033[00m
Nov 22 03:25:27 np0005531887 nova_compute[186849]: 2025-11-22 08:25:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:25:27 np0005531887 nova_compute[186849]: 2025-11-22 08:25:27.971 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:29 np0005531887 nova_compute[186849]: 2025-11-22 08:25:29.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:30 np0005531887 podman[240300]: 2025-11-22 08:25:30.868286804 +0000 UTC m=+0.086222684 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:25:30 np0005531887 podman[240301]: 2025-11-22 08:25:30.943053592 +0000 UTC m=+0.158680104 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 03:25:31 np0005531887 nova_compute[186849]: 2025-11-22 08:25:31.820 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:32 np0005531887 nova_compute[186849]: 2025-11-22 08:25:32.972 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:34 np0005531887 nova_compute[186849]: 2025-11-22 08:25:34.427 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:34 np0005531887 podman[240348]: 2025-11-22 08:25:34.866232739 +0000 UTC m=+0.083864155 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:25:36 np0005531887 nova_compute[186849]: 2025-11-22 08:25:36.791 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:37.358 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:37.359 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:37.359 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:37 np0005531887 nova_compute[186849]: 2025-11-22 08:25:37.974 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:39 np0005531887 nova_compute[186849]: 2025-11-22 08:25:39.428 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:39 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:39Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:25:41 np0005531887 podman[240384]: 2025-11-22 08:25:41.865092145 +0000 UTC m=+0.077435467 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:25:42 np0005531887 nova_compute[186849]: 2025-11-22 08:25:42.976 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:44 np0005531887 nova_compute[186849]: 2025-11-22 08:25:44.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:44 np0005531887 nova_compute[186849]: 2025-11-22 08:25:44.814 186853 INFO nova.compute.manager [None req-c7ebdd2f-c216-4e9b-b549-20877409de1b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Get console output#033[00m
Nov 22 03:25:44 np0005531887 nova_compute[186849]: 2025-11-22 08:25:44.821 213402 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 22 03:25:45 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:45Z|00555|binding|INFO|Releasing lport c3dedc73-8573-44a6-afa7-82c65dae3823 from this chassis (sb_readonly=0)
Nov 22 03:25:45 np0005531887 nova_compute[186849]: 2025-11-22 08:25:45.945 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.148 186853 DEBUG nova.compute.manager [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.148 186853 DEBUG nova.compute.manager [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing instance network info cache due to event network-changed-a6fcb4cd-f25f-467f-926b-423518d175bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.149 186853 DEBUG oslo_concurrency.lockutils [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.149 186853 DEBUG oslo_concurrency.lockutils [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.149 186853 DEBUG nova.network.neutron [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Refreshing network info cache for port a6fcb4cd-f25f-467f-926b-423518d175bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.408 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.408 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.409 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.409 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.409 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.418 186853 INFO nova.compute.manager [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Terminating instance#033[00m
Nov 22 03:25:46 np0005531887 nova_compute[186849]: 2025-11-22 08:25:46.423 186853 DEBUG nova.compute.manager [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:25:46 np0005531887 kernel: tapa6fcb4cd-f2 (unregistering): left promiscuous mode
Nov 22 03:25:46 np0005531887 NetworkManager[55210]: <info>  [1763799946.9925] device (tapa6fcb4cd-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.004 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00556|binding|INFO|Releasing lport a6fcb4cd-f25f-467f-926b-423518d175bb from this chassis (sb_readonly=0)
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00557|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb down in Southbound
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00558|binding|INFO|Removing iface tapa6fcb4cd-f2 ovn-installed in OVS
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.008 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Nov 22 03:25:47 np0005531887 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d000000a0.scope: Consumed 15.659s CPU time.
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.058 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.060 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd unbound from our chassis#033[00m
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.061 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87c3d412-41cf-4963-ada8-de4b3881e6fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:25:47 np0005531887 systemd-machined[153180]: Machine qemu-59-instance-000000a0 terminated.
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.062 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[81db5f5d-aa5b-4c38-86b8-2ea02c888925]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.062 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd namespace which is not needed anymore#033[00m
Nov 22 03:25:47 np0005531887 podman[240404]: 2025-11-22 08:25:47.074278103 +0000 UTC m=+0.070518625 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 22 03:25:47 np0005531887 kernel: tapa6fcb4cd-f2: entered promiscuous mode
Nov 22 03:25:47 np0005531887 kernel: tapa6fcb4cd-f2 (unregistering): left promiscuous mode
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00559|binding|INFO|Claiming lport a6fcb4cd-f25f-467f-926b-423518d175bb for this chassis.
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00560|binding|INFO|a6fcb4cd-f25f-467f-926b-423518d175bb: Claiming fa:16:3e:04:50:b7 10.100.0.8
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.256 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00561|binding|INFO|Setting lport a6fcb4cd-f25f-467f-926b-423518d175bb ovn-installed in OVS
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00562|if_status|INFO|Dropped 2 log messages in last 1192 seconds (most recently, 1192 seconds ago) due to excessive rate
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00563|if_status|INFO|Not setting lport a6fcb4cd-f25f-467f-926b-423518d175bb down as sb is readonly
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.278 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 ovn_controller[95130]: 2025-11-22T08:25:47Z|00564|binding|INFO|Releasing lport a6fcb4cd-f25f-467f-926b-423518d175bb from this chassis (sb_readonly=0)
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.295 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.298 186853 INFO nova.virt.libvirt.driver [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Instance destroyed successfully.#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.298 186853 DEBUG nova.objects.instance [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 0f4580b9-9b95-420b-a41e-971fafe8dab0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.310 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:50:b7 10.100.0.8'], port_security=['fa:16:3e:04:50:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0f4580b9-9b95-420b-a41e-971fafe8dab0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '919e7659-019c-4818-b503-4d36b04f6078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ed71e8b-3bd3-4be1-8730-8e7ffc983db4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=a6fcb4cd-f25f-467f-926b-423518d175bb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.318 186853 DEBUG nova.virt.libvirt.vif [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:24:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1702931055',display_name='tempest-TestNetworkAdvancedServerOps-server-1702931055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1702931055',id=160,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxaQpYKOKe4L0KR7bY79WQEoXUDaFhQINyKgLQxxD+DY1SMW41QKPoSnfT27Llv7MI1/G06FoayeK4tBR1oc1AWt511XOIyHrR4CcBO56ZLEeXBHthfNrR7xWlOpV113A==',key_name='tempest-TestNetworkAdvancedServerOps-1146531194',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:24:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-h8w9hc3n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:25:23Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=0f4580b9-9b95-420b-a41e-971fafe8dab0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.318 186853 DEBUG nova.network.os_vif_util [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [NOTICE]   (240289) : haproxy version is 2.8.14-c23fe91
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [NOTICE]   (240289) : path to executable is /usr/sbin/haproxy
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [WARNING]  (240289) : Exiting Master process...
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [WARNING]  (240289) : Exiting Master process...
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.320 186853 DEBUG nova.network.os_vif_util [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.320 186853 DEBUG os_vif [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [ALERT]    (240289) : Current worker (240291) exited with code 143 (Terminated)
Nov 22 03:25:47 np0005531887 neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd[240282]: [WARNING]  (240289) : All workers exited. Exiting... (0)
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.322 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.323 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6fcb4cd-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:47 np0005531887 systemd[1]: libpod-90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26.scope: Deactivated successfully.
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.324 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.329 186853 INFO os_vif [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:50:b7,bridge_name='br-int',has_traffic_filtering=True,id=a6fcb4cd-f25f-467f-926b-423518d175bb,network=Network(87c3d412-41cf-4963-ada8-de4b3881e6fd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6fcb4cd-f2')#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.329 186853 INFO nova.virt.libvirt.driver [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Deleting instance files /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0_del#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.330 186853 INFO nova.virt.libvirt.driver [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Deletion of /var/lib/nova/instances/0f4580b9-9b95-420b-a41e-971fafe8dab0_del complete#033[00m
Nov 22 03:25:47 np0005531887 podman[240448]: 2025-11-22 08:25:47.330707752 +0000 UTC m=+0.174751280 container died 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.469 186853 DEBUG nova.compute.manager [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.469 186853 DEBUG oslo_concurrency.lockutils [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.469 186853 DEBUG oslo_concurrency.lockutils [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.470 186853 DEBUG oslo_concurrency.lockutils [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.470 186853 DEBUG nova.compute.manager [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.470 186853 DEBUG nova.compute.manager [req-44a7ead7-e547-493e-8ac8-310001dec977 req-4ee08281-df84-454d-9cba-2eebfd6cdd59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-unplugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:25:47 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26-userdata-shm.mount: Deactivated successfully.
Nov 22 03:25:47 np0005531887 systemd[1]: var-lib-containers-storage-overlay-e7f28e2be460a5719f6418139fa83f48328f7445bef6789ae86cedd7289812c5-merged.mount: Deactivated successfully.
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.636 186853 INFO nova.compute.manager [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.637 186853 DEBUG oslo.service.loopingcall [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.637 186853 DEBUG nova.compute.manager [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.637 186853 DEBUG nova.network.neutron [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:47.897 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:25:47 np0005531887 nova_compute[186849]: 2025-11-22 08:25:47.977 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:48 np0005531887 podman[240448]: 2025-11-22 08:25:48.066599267 +0000 UTC m=+0.910642785 container cleanup 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:25:48 np0005531887 systemd[1]: libpod-conmon-90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26.scope: Deactivated successfully.
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.665 186853 DEBUG nova.network.neutron [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updated VIF entry in instance network info cache for port a6fcb4cd-f25f-467f-926b-423518d175bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.666 186853 DEBUG nova.network.neutron [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [{"id": "a6fcb4cd-f25f-467f-926b-423518d175bb", "address": "fa:16:3e:04:50:b7", "network": {"id": "87c3d412-41cf-4963-ada8-de4b3881e6fd", "bridge": "br-int", "label": "tempest-network-smoke--716674626", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6fcb4cd-f2", "ovs_interfaceid": "a6fcb4cd-f25f-467f-926b-423518d175bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:48 np0005531887 podman[240495]: 2025-11-22 08:25:48.677074456 +0000 UTC m=+0.586858119 container remove 90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.682 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[87e7cc4a-3587-4296-824b-6f2afc6096d7]: (4, ('Sat Nov 22 08:25:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd (90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26)\n90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26\nSat Nov 22 08:25:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd (90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26)\n90a3ce87360e88e1c491a5116ee49dea24c233ebe3f8f69cb21d2d2589dfed26\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.684 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[911d20e9-7af1-4f63-a913-3f2f9f4c5c57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.685 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87c3d412-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.688 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:48 np0005531887 kernel: tap87c3d412-40: left promiscuous mode
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.699 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.701 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.703 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[68281a6b-852d-42d7-896d-fa084ffc473c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.727 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[31836593-8a85-46cc-86eb-4ad812ea80b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.730 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa0ce27-200f-4a42-817d-f2dc95010b90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.749 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[a78b750b-cb0a-4f27-a11c-1080764afa99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 666664, 'reachable_time': 15068, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240510, 'error': None, 'target': 'ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 systemd[1]: run-netns-ovnmeta\x2d87c3d412\x2d41cf\x2d4963\x2dada8\x2dde4b3881e6fd.mount: Deactivated successfully.
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.755 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-87c3d412-41cf-4963-ada8-de4b3881e6fd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.755 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[7e44a534-5b10-4475-a2cd-e83e46c613dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.757 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd unbound from our chassis#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.759 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87c3d412-41cf-4963-ada8-de4b3881e6fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.760 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[482d4062-9fee-4378-9f26-d45cb7431c52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.761 104084 INFO neutron.agent.ovn.metadata.agent [-] Port a6fcb4cd-f25f-467f-926b-423518d175bb in datapath 87c3d412-41cf-4963-ada8-de4b3881e6fd unbound from our chassis#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.763 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87c3d412-41cf-4963-ada8-de4b3881e6fd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.764 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ae92ec-ad28-4d29-a85b-f7d0e476f8f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:25:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:48.766 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.828 186853 DEBUG oslo_concurrency.lockutils [req-ca57ce0e-55ce-4645-8b74-1589d166b455 req-fb409ad5-d0d8-449b-8e2a-8d1348379364 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-0f4580b9-9b95-420b-a41e-971fafe8dab0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.843 186853 DEBUG nova.network.neutron [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.938 186853 DEBUG nova.compute.manager [req-bed5e406-7bd1-4a6b-b9e6-509a2a12884c req-8ee8d0b4-e301-446a-844d-bc12683823ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-deleted-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.938 186853 INFO nova.compute.manager [req-bed5e406-7bd1-4a6b-b9e6-509a2a12884c req-8ee8d0b4-e301-446a-844d-bc12683823ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Neutron deleted interface a6fcb4cd-f25f-467f-926b-423518d175bb; detaching it from the instance and deleting it from the info cache#033[00m
Nov 22 03:25:48 np0005531887 nova_compute[186849]: 2025-11-22 08:25:48.939 186853 DEBUG nova.network.neutron [req-bed5e406-7bd1-4a6b-b9e6-509a2a12884c req-8ee8d0b4-e301-446a-844d-bc12683823ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.009 186853 INFO nova.compute.manager [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Took 1.37 seconds to deallocate network for instance.#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.021 186853 DEBUG nova.compute.manager [req-bed5e406-7bd1-4a6b-b9e6-509a2a12884c req-8ee8d0b4-e301-446a-844d-bc12683823ed 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Detach interface failed, port_id=a6fcb4cd-f25f-467f-926b-423518d175bb, reason: Instance 0f4580b9-9b95-420b-a41e-971fafe8dab0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.130 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.130 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.216 186853 DEBUG nova.compute.provider_tree [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.228 186853 DEBUG nova.scheduler.client.report [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.281 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.328 186853 INFO nova.scheduler.client.report [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 0f4580b9-9b95-420b-a41e-971fafe8dab0#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.442 186853 DEBUG oslo_concurrency.lockutils [None req-c84bc0dd-eb3d-4859-9849-6f41bcc0a18e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.608 186853 DEBUG nova.compute.manager [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.609 186853 DEBUG oslo_concurrency.lockutils [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.609 186853 DEBUG oslo_concurrency.lockutils [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.609 186853 DEBUG oslo_concurrency.lockutils [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "0f4580b9-9b95-420b-a41e-971fafe8dab0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.610 186853 DEBUG nova.compute.manager [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] No waiting events found dispatching network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:25:49 np0005531887 nova_compute[186849]: 2025-11-22 08:25:49.610 186853 WARNING nova.compute.manager [req-a87636d7-6eea-4a4c-af00-eb0dbc0ca015 req-3a0d540e-0b57-4d32-b4de-f426d38defc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Received unexpected event network-vif-plugged-a6fcb4cd-f25f-467f-926b-423518d175bb for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:25:50 np0005531887 podman[240511]: 2025-11-22 08:25:50.84284415 +0000 UTC m=+0.057490226 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:25:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:25:51.768 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:25:52 np0005531887 nova_compute[186849]: 2025-11-22 08:25:52.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:52 np0005531887 nova_compute[186849]: 2025-11-22 08:25:52.979 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:55 np0005531887 podman[240535]: 2025-11-22 08:25:55.845893426 +0000 UTC m=+0.064194421 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 22 03:25:57 np0005531887 nova_compute[186849]: 2025-11-22 08:25:57.329 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:25:57 np0005531887 nova_compute[186849]: 2025-11-22 08:25:57.981 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:01 np0005531887 podman[240558]: 2025-11-22 08:26:01.889557628 +0000 UTC m=+0.097365317 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:26:01 np0005531887 podman[240559]: 2025-11-22 08:26:01.889568408 +0000 UTC m=+0.093006480 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 03:26:02 np0005531887 nova_compute[186849]: 2025-11-22 08:26:02.296 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799947.2951484, 0f4580b9-9b95-420b-a41e-971fafe8dab0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:26:02 np0005531887 nova_compute[186849]: 2025-11-22 08:26:02.297 186853 INFO nova.compute.manager [-] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:26:02 np0005531887 nova_compute[186849]: 2025-11-22 08:26:02.311 186853 DEBUG nova.compute.manager [None req-df836a15-05e7-4b20-986e-6efcd16d0934 - - - - - -] [instance: 0f4580b9-9b95-420b-a41e-971fafe8dab0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:26:02 np0005531887 nova_compute[186849]: 2025-11-22 08:26:02.330 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:02 np0005531887 nova_compute[186849]: 2025-11-22 08:26:02.984 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:03 np0005531887 nova_compute[186849]: 2025-11-22 08:26:03.594 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:03 np0005531887 nova_compute[186849]: 2025-11-22 08:26:03.671 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:05 np0005531887 podman[240602]: 2025-11-22 08:26:05.832141315 +0000 UTC m=+0.049265153 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:26:07 np0005531887 nova_compute[186849]: 2025-11-22 08:26:07.334 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:07 np0005531887 nova_compute[186849]: 2025-11-22 08:26:07.986 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:09 np0005531887 nova_compute[186849]: 2025-11-22 08:26:09.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:11 np0005531887 nova_compute[186849]: 2025-11-22 08:26:11.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:11 np0005531887 nova_compute[186849]: 2025-11-22 08:26:11.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:26:11 np0005531887 nova_compute[186849]: 2025-11-22 08:26:11.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:26:11 np0005531887 nova_compute[186849]: 2025-11-22 08:26:11.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:26:12 np0005531887 nova_compute[186849]: 2025-11-22 08:26:12.336 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:12 np0005531887 podman[240626]: 2025-11-22 08:26:12.8685701 +0000 UTC m=+0.085952466 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:26:12 np0005531887 nova_compute[186849]: 2025-11-22 08:26:12.988 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:14 np0005531887 nova_compute[186849]: 2025-11-22 08:26:14.779 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:15 np0005531887 nova_compute[186849]: 2025-11-22 08:26:15.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:15 np0005531887 nova_compute[186849]: 2025-11-22 08:26:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:15 np0005531887 nova_compute[186849]: 2025-11-22 08:26:15.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.803 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.995 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5731MB free_disk=73.27409744262695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:16 np0005531887 nova_compute[186849]: 2025-11-22 08:26:16.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.051 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.052 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.070 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.087 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.087 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.120 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.147 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.173 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.189 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.228 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.229 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.340 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:17 np0005531887 podman[240646]: 2025-11-22 08:26:17.849245778 +0000 UTC m=+0.067935692 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 22 03:26:17 np0005531887 nova_compute[186849]: 2025-11-22 08:26:17.991 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:21 np0005531887 podman[240666]: 2025-11-22 08:26:21.825644498 +0000 UTC m=+0.047794957 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:26:22 np0005531887 nova_compute[186849]: 2025-11-22 08:26:22.229 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:22 np0005531887 nova_compute[186849]: 2025-11-22 08:26:22.343 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:22 np0005531887 nova_compute[186849]: 2025-11-22 08:26:22.992 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:23 np0005531887 nova_compute[186849]: 2025-11-22 08:26:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:26 np0005531887 podman[240691]: 2025-11-22 08:26:26.85242637 +0000 UTC m=+0.068972258 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 22 03:26:27 np0005531887 nova_compute[186849]: 2025-11-22 08:26:27.346 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:27 np0005531887 nova_compute[186849]: 2025-11-22 08:26:27.993 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:29 np0005531887 nova_compute[186849]: 2025-11-22 08:26:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:32 np0005531887 nova_compute[186849]: 2025-11-22 08:26:32.350 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:32 np0005531887 podman[240713]: 2025-11-22 08:26:32.856525717 +0000 UTC m=+0.074567726 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:26:32 np0005531887 podman[240714]: 2025-11-22 08:26:32.880929117 +0000 UTC m=+0.093744777 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:26:32 np0005531887 nova_compute[186849]: 2025-11-22 08:26:32.996 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:26:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:26:36 np0005531887 podman[240760]: 2025-11-22 08:26:36.847017034 +0000 UTC m=+0.063648088 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:26:37 np0005531887 nova_compute[186849]: 2025-11-22 08:26:37.353 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:37.359 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:37.360 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:26:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:37.360 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:26:37 np0005531887 nova_compute[186849]: 2025-11-22 08:26:37.998 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:40 np0005531887 nova_compute[186849]: 2025-11-22 08:26:40.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:26:42 np0005531887 nova_compute[186849]: 2025-11-22 08:26:42.357 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:43 np0005531887 nova_compute[186849]: 2025-11-22 08:26:43.001 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:43 np0005531887 podman[240784]: 2025-11-22 08:26:43.843003033 +0000 UTC m=+0.059562176 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 03:26:47 np0005531887 nova_compute[186849]: 2025-11-22 08:26:47.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:48 np0005531887 nova_compute[186849]: 2025-11-22 08:26:48.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:48 np0005531887 podman[240803]: 2025-11-22 08:26:48.838033494 +0000 UTC m=+0.055065846 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 03:26:50 np0005531887 nova_compute[186849]: 2025-11-22 08:26:50.717 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:50.718 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:26:50 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:50.719 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:26:52 np0005531887 nova_compute[186849]: 2025-11-22 08:26:52.363 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:52 np0005531887 podman[240824]: 2025-11-22 08:26:52.839900731 +0000 UTC m=+0.059568737 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:26:53 np0005531887 nova_compute[186849]: 2025-11-22 08:26:53.005 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:54 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:26:54.735 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:26:57 np0005531887 nova_compute[186849]: 2025-11-22 08:26:57.366 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:26:57 np0005531887 ovn_controller[95130]: 2025-11-22T08:26:57Z|00565|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 03:26:57 np0005531887 podman[240848]: 2025-11-22 08:26:57.843584234 +0000 UTC m=+0.064111858 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:26:58 np0005531887 nova_compute[186849]: 2025-11-22 08:26:58.010 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:02 np0005531887 nova_compute[186849]: 2025-11-22 08:27:02.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:03 np0005531887 nova_compute[186849]: 2025-11-22 08:27:03.012 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:03 np0005531887 podman[240870]: 2025-11-22 08:27:03.837143772 +0000 UTC m=+0.059935804 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:27:03 np0005531887 podman[240871]: 2025-11-22 08:27:03.890466413 +0000 UTC m=+0.109728009 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:27:07 np0005531887 nova_compute[186849]: 2025-11-22 08:27:07.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:07 np0005531887 podman[240911]: 2025-11-22 08:27:07.835457821 +0000 UTC m=+0.053695122 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:27:08 np0005531887 nova_compute[186849]: 2025-11-22 08:27:08.014 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:11 np0005531887 nova_compute[186849]: 2025-11-22 08:27:11.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:12 np0005531887 nova_compute[186849]: 2025-11-22 08:27:12.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:12 np0005531887 nova_compute[186849]: 2025-11-22 08:27:12.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:12 np0005531887 nova_compute[186849]: 2025-11-22 08:27:12.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:27:12 np0005531887 nova_compute[186849]: 2025-11-22 08:27:12.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:27:12 np0005531887 nova_compute[186849]: 2025-11-22 08:27:12.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:27:13 np0005531887 nova_compute[186849]: 2025-11-22 08:27:13.017 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:14 np0005531887 podman[240935]: 2025-11-22 08:27:14.82701509 +0000 UTC m=+0.050504353 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:27:15 np0005531887 nova_compute[186849]: 2025-11-22 08:27:15.774 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.946 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.946 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5737MB free_disk=73.27411651611328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.947 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.947 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:27:16 np0005531887 nova_compute[186849]: 2025-11-22 08:27:16.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:27:17 np0005531887 nova_compute[186849]: 2025-11-22 08:27:17.021 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:27:17 np0005531887 nova_compute[186849]: 2025-11-22 08:27:17.032 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:27:17 np0005531887 nova_compute[186849]: 2025-11-22 08:27:17.033 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:27:17 np0005531887 nova_compute[186849]: 2025-11-22 08:27:17.034 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:17 np0005531887 nova_compute[186849]: 2025-11-22 08:27:17.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:18 np0005531887 nova_compute[186849]: 2025-11-22 08:27:18.019 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:19 np0005531887 podman[240956]: 2025-11-22 08:27:19.845601262 +0000 UTC m=+0.062508109 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 22 03:27:22 np0005531887 nova_compute[186849]: 2025-11-22 08:27:22.385 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:23 np0005531887 nova_compute[186849]: 2025-11-22 08:27:23.021 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:23 np0005531887 podman[240976]: 2025-11-22 08:27:23.856339835 +0000 UTC m=+0.073650452 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:27:24 np0005531887 nova_compute[186849]: 2025-11-22 08:27:24.034 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:24 np0005531887 nova_compute[186849]: 2025-11-22 08:27:24.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:27 np0005531887 nova_compute[186849]: 2025-11-22 08:27:27.387 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:28 np0005531887 nova_compute[186849]: 2025-11-22 08:27:28.026 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:28 np0005531887 podman[241000]: 2025-11-22 08:27:28.835692972 +0000 UTC m=+0.054301528 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Nov 22 03:27:30 np0005531887 nova_compute[186849]: 2025-11-22 08:27:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:27:32 np0005531887 nova_compute[186849]: 2025-11-22 08:27:32.391 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:33 np0005531887 nova_compute[186849]: 2025-11-22 08:27:33.028 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:34 np0005531887 podman[241021]: 2025-11-22 08:27:34.837982464 +0000 UTC m=+0.058542282 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:27:34 np0005531887 podman[241022]: 2025-11-22 08:27:34.872494782 +0000 UTC m=+0.089278907 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:37.361 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:37.362 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:27:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:37.362 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:27:37 np0005531887 nova_compute[186849]: 2025-11-22 08:27:37.395 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:38 np0005531887 nova_compute[186849]: 2025-11-22 08:27:38.030 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:38 np0005531887 podman[241066]: 2025-11-22 08:27:38.83947428 +0000 UTC m=+0.061244867 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:27:42 np0005531887 nova_compute[186849]: 2025-11-22 08:27:42.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:43 np0005531887 nova_compute[186849]: 2025-11-22 08:27:43.032 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:45 np0005531887 podman[241090]: 2025-11-22 08:27:45.831447821 +0000 UTC m=+0.047812388 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:27:47 np0005531887 nova_compute[186849]: 2025-11-22 08:27:47.402 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:48 np0005531887 nova_compute[186849]: 2025-11-22 08:27:48.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:49.620 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:27:49 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:49.621 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:27:49 np0005531887 nova_compute[186849]: 2025-11-22 08:27:49.622 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:50 np0005531887 podman[241111]: 2025-11-22 08:27:50.879395935 +0000 UTC m=+0.092371294 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:27:52 np0005531887 nova_compute[186849]: 2025-11-22 08:27:52.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:53 np0005531887 nova_compute[186849]: 2025-11-22 08:27:53.258 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:27:53.623 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:27:54 np0005531887 podman[241131]: 2025-11-22 08:27:54.828604014 +0000 UTC m=+0.049152370 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:27:57 np0005531887 nova_compute[186849]: 2025-11-22 08:27:57.409 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:58 np0005531887 nova_compute[186849]: 2025-11-22 08:27:58.260 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:27:59 np0005531887 podman[241155]: 2025-11-22 08:27:59.864854229 +0000 UTC m=+0.084331005 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Nov 22 03:28:02 np0005531887 nova_compute[186849]: 2025-11-22 08:28:02.412 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:03 np0005531887 nova_compute[186849]: 2025-11-22 08:28:03.261 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:05 np0005531887 podman[241176]: 2025-11-22 08:28:05.833717061 +0000 UTC m=+0.057393983 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:28:05 np0005531887 podman[241177]: 2025-11-22 08:28:05.906970343 +0000 UTC m=+0.123374716 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:28:07 np0005531887 nova_compute[186849]: 2025-11-22 08:28:07.415 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:08 np0005531887 nova_compute[186849]: 2025-11-22 08:28:08.264 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:09 np0005531887 podman[241223]: 2025-11-22 08:28:09.839874653 +0000 UTC m=+0.058031688 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:28:11 np0005531887 nova_compute[186849]: 2025-11-22 08:28:11.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:12 np0005531887 nova_compute[186849]: 2025-11-22 08:28:12.419 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:13 np0005531887 nova_compute[186849]: 2025-11-22 08:28:13.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:13 np0005531887 nova_compute[186849]: 2025-11-22 08:28:13.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:13 np0005531887 nova_compute[186849]: 2025-11-22 08:28:13.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:28:13 np0005531887 nova_compute[186849]: 2025-11-22 08:28:13.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:28:13 np0005531887 nova_compute[186849]: 2025-11-22 08:28:13.786 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:28:15 np0005531887 nova_compute[186849]: 2025-11-22 08:28:15.780 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:16 np0005531887 nova_compute[186849]: 2025-11-22 08:28:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:16 np0005531887 nova_compute[186849]: 2025-11-22 08:28:16.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:28:16 np0005531887 podman[241249]: 2025-11-22 08:28:16.843667517 +0000 UTC m=+0.060045659 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.422 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.984 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.985 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=73.27409744262695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.985 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:17 np0005531887 nova_compute[186849]: 2025-11-22 08:28:17.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.190 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.190 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.216 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.231 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.233 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.233 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:18 np0005531887 nova_compute[186849]: 2025-11-22 08:28:18.268 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:19 np0005531887 nova_compute[186849]: 2025-11-22 08:28:19.234 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:21 np0005531887 podman[241268]: 2025-11-22 08:28:21.841492246 +0000 UTC m=+0.060886179 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:28:22 np0005531887 nova_compute[186849]: 2025-11-22 08:28:22.426 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:23 np0005531887 nova_compute[186849]: 2025-11-22 08:28:23.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:23 np0005531887 nova_compute[186849]: 2025-11-22 08:28:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:24 np0005531887 nova_compute[186849]: 2025-11-22 08:28:24.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:25 np0005531887 podman[241288]: 2025-11-22 08:28:25.848919539 +0000 UTC m=+0.073588691 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:28:27 np0005531887 nova_compute[186849]: 2025-11-22 08:28:27.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:28 np0005531887 nova_compute[186849]: 2025-11-22 08:28:28.272 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:30 np0005531887 podman[241313]: 2025-11-22 08:28:30.843536939 +0000 UTC m=+0.062068278 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 22 03:28:31 np0005531887 nova_compute[186849]: 2025-11-22 08:28:31.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:32 np0005531887 nova_compute[186849]: 2025-11-22 08:28:32.431 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:33 np0005531887 nova_compute[186849]: 2025-11-22 08:28:33.275 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:28:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:28:36 np0005531887 podman[241332]: 2025-11-22 08:28:36.835156879 +0000 UTC m=+0.053865917 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2)
Nov 22 03:28:36 np0005531887 podman[241333]: 2025-11-22 08:28:36.867104715 +0000 UTC m=+0.082882711 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:28:37.362 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:28:37.363 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:28:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:28:37.363 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:28:37 np0005531887 nova_compute[186849]: 2025-11-22 08:28:37.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:38 np0005531887 nova_compute[186849]: 2025-11-22 08:28:38.278 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:40 np0005531887 podman[241377]: 2025-11-22 08:28:40.831301103 +0000 UTC m=+0.051607521 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:28:41 np0005531887 nova_compute[186849]: 2025-11-22 08:28:41.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:28:42 np0005531887 nova_compute[186849]: 2025-11-22 08:28:42.436 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:43 np0005531887 nova_compute[186849]: 2025-11-22 08:28:43.279 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:47 np0005531887 nova_compute[186849]: 2025-11-22 08:28:47.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:47 np0005531887 podman[241402]: 2025-11-22 08:28:47.829353286 +0000 UTC m=+0.053719413 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:28:48 np0005531887 nova_compute[186849]: 2025-11-22 08:28:48.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:52 np0005531887 nova_compute[186849]: 2025-11-22 08:28:52.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:52 np0005531887 podman[241421]: 2025-11-22 08:28:52.838537336 +0000 UTC m=+0.053833846 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:28:53 np0005531887 nova_compute[186849]: 2025-11-22 08:28:53.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:56 np0005531887 podman[241441]: 2025-11-22 08:28:56.847199509 +0000 UTC m=+0.070352362 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:28:57 np0005531887 nova_compute[186849]: 2025-11-22 08:28:57.446 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:28:58 np0005531887 nova_compute[186849]: 2025-11-22 08:28:58.285 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:01 np0005531887 podman[241467]: 2025-11-22 08:29:01.863529594 +0000 UTC m=+0.077812865 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:29:02 np0005531887 nova_compute[186849]: 2025-11-22 08:29:02.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:03 np0005531887 nova_compute[186849]: 2025-11-22 08:29:03.289 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:07 np0005531887 nova_compute[186849]: 2025-11-22 08:29:07.453 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:07 np0005531887 podman[241488]: 2025-11-22 08:29:07.878811637 +0000 UTC m=+0.098185947 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, 
tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 03:29:07 np0005531887 podman[241489]: 2025-11-22 08:29:07.879088154 +0000 UTC m=+0.094673591 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:29:08 np0005531887 nova_compute[186849]: 2025-11-22 08:29:08.290 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:11 np0005531887 podman[241533]: 2025-11-22 08:29:11.875955278 +0000 UTC m=+0.094552217 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:29:12 np0005531887 nova_compute[186849]: 2025-11-22 08:29:12.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:12 np0005531887 nova_compute[186849]: 2025-11-22 08:29:12.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:13 np0005531887 nova_compute[186849]: 2025-11-22 08:29:13.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:15 np0005531887 nova_compute[186849]: 2025-11-22 08:29:15.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:15 np0005531887 nova_compute[186849]: 2025-11-22 08:29:15.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:29:15 np0005531887 nova_compute[186849]: 2025-11-22 08:29:15.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:29:15 np0005531887 nova_compute[186849]: 2025-11-22 08:29:15.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:29:16 np0005531887 nova_compute[186849]: 2025-11-22 08:29:16.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:17 np0005531887 nova_compute[186849]: 2025-11-22 08:29:17.457 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:18.136 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:18 np0005531887 nova_compute[186849]: 2025-11-22 08:29:18.136 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:18 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:18.137 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:29:18 np0005531887 nova_compute[186849]: 2025-11-22 08:29:18.293 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:18 np0005531887 nova_compute[186849]: 2025-11-22 08:29:18.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:18 np0005531887 nova_compute[186849]: 2025-11-22 08:29:18.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:29:18 np0005531887 podman[241557]: 2025-11-22 08:29:18.838384951 +0000 UTC m=+0.055257470 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:29:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:19.139 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.805 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.806 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.806 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.806 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.984 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.986 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5727MB free_disk=73.27411651611328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:19 np0005531887 nova_compute[186849]: 2025-11-22 08:29:19.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.071 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.072 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.093 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.105 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.106 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:29:20 np0005531887 nova_compute[186849]: 2025-11-22 08:29:20.106 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:21 np0005531887 nova_compute[186849]: 2025-11-22 08:29:21.107 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:22 np0005531887 nova_compute[186849]: 2025-11-22 08:29:22.460 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:23 np0005531887 nova_compute[186849]: 2025-11-22 08:29:23.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:23 np0005531887 podman[241577]: 2025-11-22 08:29:23.840974108 +0000 UTC m=+0.061155136 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:29:24 np0005531887 nova_compute[186849]: 2025-11-22 08:29:24.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:25 np0005531887 nova_compute[186849]: 2025-11-22 08:29:25.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:27 np0005531887 nova_compute[186849]: 2025-11-22 08:29:27.462 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:27 np0005531887 nova_compute[186849]: 2025-11-22 08:29:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:27 np0005531887 nova_compute[186849]: 2025-11-22 08:29:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:29:27 np0005531887 nova_compute[186849]: 2025-11-22 08:29:27.806 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:29:27 np0005531887 podman[241597]: 2025-11-22 08:29:27.861812312 +0000 UTC m=+0.079851896 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:29:28 np0005531887 nova_compute[186849]: 2025-11-22 08:29:28.299 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:32 np0005531887 nova_compute[186849]: 2025-11-22 08:29:32.465 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:32 np0005531887 nova_compute[186849]: 2025-11-22 08:29:32.806 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:32 np0005531887 podman[241623]: 2025-11-22 08:29:32.850986429 +0000 UTC m=+0.065148533 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 03:29:33 np0005531887 nova_compute[186849]: 2025-11-22 08:29:33.301 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:37.363 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:37.364 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:29:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:37.364 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:29:37 np0005531887 nova_compute[186849]: 2025-11-22 08:29:37.468 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:38 np0005531887 nova_compute[186849]: 2025-11-22 08:29:38.303 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:38 np0005531887 podman[241644]: 2025-11-22 08:29:38.846394053 +0000 UTC m=+0.063878203 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 03:29:38 np0005531887 podman[241645]: 2025-11-22 08:29:38.879283942 +0000 UTC m=+0.088531199 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 03:29:42 np0005531887 nova_compute[186849]: 2025-11-22 08:29:42.471 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:42 np0005531887 nova_compute[186849]: 2025-11-22 08:29:42.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:42 np0005531887 nova_compute[186849]: 2025-11-22 08:29:42.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:29:42 np0005531887 podman[241688]: 2025-11-22 08:29:42.845393389 +0000 UTC m=+0.056162243 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:29:43 np0005531887 nova_compute[186849]: 2025-11-22 08:29:43.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:47 np0005531887 nova_compute[186849]: 2025-11-22 08:29:47.476 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:48 np0005531887 nova_compute[186849]: 2025-11-22 08:29:48.309 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:49 np0005531887 podman[241713]: 2025-11-22 08:29:49.837013052 +0000 UTC m=+0.052966625 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:29:52 np0005531887 nova_compute[186849]: 2025-11-22 08:29:52.480 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:52 np0005531887 nova_compute[186849]: 2025-11-22 08:29:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:29:53 np0005531887 nova_compute[186849]: 2025-11-22 08:29:53.312 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:54 np0005531887 podman[241734]: 2025-11-22 08:29:54.602490184 +0000 UTC m=+0.065766189 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:29:57 np0005531887 nova_compute[186849]: 2025-11-22 08:29:57.484 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:58.153 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:29:58 np0005531887 nova_compute[186849]: 2025-11-22 08:29:58.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:58 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:29:58.154 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:29:58 np0005531887 nova_compute[186849]: 2025-11-22 08:29:58.313 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:29:58 np0005531887 podman[241754]: 2025-11-22 08:29:58.834416751 +0000 UTC m=+0.056012079 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:30:02 np0005531887 nova_compute[186849]: 2025-11-22 08:30:02.487 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:03 np0005531887 nova_compute[186849]: 2025-11-22 08:30:03.316 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:03 np0005531887 podman[241778]: 2025-11-22 08:30:03.838401132 +0000 UTC m=+0.054735098 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 22 03:30:07 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:30:07.157 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:30:07 np0005531887 nova_compute[186849]: 2025-11-22 08:30:07.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:08 np0005531887 nova_compute[186849]: 2025-11-22 08:30:08.319 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:09 np0005531887 podman[241799]: 2025-11-22 08:30:09.842141939 +0000 UTC m=+0.061795132 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:30:09 np0005531887 podman[241800]: 2025-11-22 08:30:09.874846193 +0000 UTC m=+0.088300994 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 03:30:12 np0005531887 nova_compute[186849]: 2025-11-22 08:30:12.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:12 np0005531887 nova_compute[186849]: 2025-11-22 08:30:12.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:13 np0005531887 nova_compute[186849]: 2025-11-22 08:30:13.321 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:13 np0005531887 podman[241846]: 2025-11-22 08:30:13.838803098 +0000 UTC m=+0.051641972 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:30:16 np0005531887 nova_compute[186849]: 2025-11-22 08:30:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:16 np0005531887 nova_compute[186849]: 2025-11-22 08:30:16.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:30:16 np0005531887 nova_compute[186849]: 2025-11-22 08:30:16.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:30:16 np0005531887 nova_compute[186849]: 2025-11-22 08:30:16.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:30:17 np0005531887 nova_compute[186849]: 2025-11-22 08:30:17.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:18 np0005531887 nova_compute[186849]: 2025-11-22 08:30:18.324 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:18 np0005531887 nova_compute[186849]: 2025-11-22 08:30:18.779 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:19 np0005531887 nova_compute[186849]: 2025-11-22 08:30:19.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:19 np0005531887 nova_compute[186849]: 2025-11-22 08:30:19.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:30:20 np0005531887 podman[241870]: 2025-11-22 08:30:20.849454098 +0000 UTC m=+0.065138143 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.800 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.985 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.987 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.2741470336914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.987 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:21 np0005531887 nova_compute[186849]: 2025-11-22 08:30:21.987 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.173 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.173 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.283 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.297 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.299 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.299 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:22 np0005531887 nova_compute[186849]: 2025-11-22 08:30:22.508 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:23 np0005531887 nova_compute[186849]: 2025-11-22 08:30:23.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:24 np0005531887 podman[241889]: 2025-11-22 08:30:24.839187506 +0000 UTC m=+0.061245848 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:30:27 np0005531887 nova_compute[186849]: 2025-11-22 08:30:27.299 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:27 np0005531887 nova_compute[186849]: 2025-11-22 08:30:27.512 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:27 np0005531887 nova_compute[186849]: 2025-11-22 08:30:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:28 np0005531887 nova_compute[186849]: 2025-11-22 08:30:28.329 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:29 np0005531887 podman[241908]: 2025-11-22 08:30:29.850350183 +0000 UTC m=+0.068123157 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:30:32 np0005531887 nova_compute[186849]: 2025-11-22 08:30:32.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:33 np0005531887 nova_compute[186849]: 2025-11-22 08:30:33.331 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:34 np0005531887 nova_compute[186849]: 2025-11-22 08:30:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:34 np0005531887 podman[241933]: 2025-11-22 08:30:34.837845239 +0000 UTC m=+0.060703804 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:30:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:30:37.365 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:30:37.366 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:30:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:30:37.366 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:30:37 np0005531887 nova_compute[186849]: 2025-11-22 08:30:37.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:38 np0005531887 nova_compute[186849]: 2025-11-22 08:30:38.335 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:40 np0005531887 podman[241955]: 2025-11-22 08:30:40.82911562 +0000 UTC m=+0.047472069 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Nov 22 03:30:40 np0005531887 podman[241956]: 2025-11-22 08:30:40.890775217 +0000 UTC m=+0.099506869 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 03:30:42 np0005531887 nova_compute[186849]: 2025-11-22 08:30:42.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:42 np0005531887 nova_compute[186849]: 2025-11-22 08:30:42.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:30:43 np0005531887 nova_compute[186849]: 2025-11-22 08:30:43.337 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:44 np0005531887 podman[242003]: 2025-11-22 08:30:44.835519467 +0000 UTC m=+0.056397938 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:30:47 np0005531887 nova_compute[186849]: 2025-11-22 08:30:47.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:48 np0005531887 nova_compute[186849]: 2025-11-22 08:30:48.340 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:51 np0005531887 podman[242030]: 2025-11-22 08:30:51.841438481 +0000 UTC m=+0.060470148 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:30:52 np0005531887 nova_compute[186849]: 2025-11-22 08:30:52.527 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:53 np0005531887 nova_compute[186849]: 2025-11-22 08:30:53.342 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:55 np0005531887 podman[242049]: 2025-11-22 08:30:55.83967641 +0000 UTC m=+0.060167622 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:30:57 np0005531887 nova_compute[186849]: 2025-11-22 08:30:57.530 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:30:58 np0005531887 nova_compute[186849]: 2025-11-22 08:30:58.343 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:00 np0005531887 podman[242069]: 2025-11-22 08:31:00.835228214 +0000 UTC m=+0.056265746 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:31:02 np0005531887 nova_compute[186849]: 2025-11-22 08:31:02.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:03 np0005531887 nova_compute[186849]: 2025-11-22 08:31:03.344 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:05 np0005531887 podman[242093]: 2025-11-22 08:31:05.828579653 +0000 UTC m=+0.053022776 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 22 03:31:07 np0005531887 nova_compute[186849]: 2025-11-22 08:31:07.539 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:08 np0005531887 nova_compute[186849]: 2025-11-22 08:31:08.347 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:11 np0005531887 podman[242112]: 2025-11-22 08:31:11.859498408 +0000 UTC m=+0.079162138 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 22 03:31:11 np0005531887 podman[242113]: 2025-11-22 08:31:11.876200059 +0000 UTC m=+0.092269931 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:31:12 np0005531887 nova_compute[186849]: 2025-11-22 08:31:12.541 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:13 np0005531887 nova_compute[186849]: 2025-11-22 08:31:13.347 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:14 np0005531887 nova_compute[186849]: 2025-11-22 08:31:14.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:15 np0005531887 podman[242155]: 2025-11-22 08:31:15.822762935 +0000 UTC m=+0.046814333 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:31:16 np0005531887 nova_compute[186849]: 2025-11-22 08:31:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:16 np0005531887 nova_compute[186849]: 2025-11-22 08:31:16.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:31:16 np0005531887 nova_compute[186849]: 2025-11-22 08:31:16.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:31:16 np0005531887 nova_compute[186849]: 2025-11-22 08:31:16.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:31:17 np0005531887 nova_compute[186849]: 2025-11-22 08:31:17.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:18 np0005531887 nova_compute[186849]: 2025-11-22 08:31:18.348 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:18 np0005531887 nova_compute[186849]: 2025-11-22 08:31:18.774 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:20 np0005531887 nova_compute[186849]: 2025-11-22 08:31:20.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:20 np0005531887 nova_compute[186849]: 2025-11-22 08:31:20.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.989 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.990 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5741MB free_disk=73.27418518066406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.990 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:21 np0005531887 nova_compute[186849]: 2025-11-22 08:31:21.990 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.085 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.086 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.101 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.118 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.119 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.133 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.152 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.172 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.185 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.187 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.187 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:22 np0005531887 nova_compute[186849]: 2025-11-22 08:31:22.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:22 np0005531887 podman[242181]: 2025-11-22 08:31:22.829394076 +0000 UTC m=+0.054017189 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:31:23 np0005531887 nova_compute[186849]: 2025-11-22 08:31:23.187 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:23 np0005531887 nova_compute[186849]: 2025-11-22 08:31:23.351 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:26 np0005531887 nova_compute[186849]: 2025-11-22 08:31:26.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:26 np0005531887 podman[242200]: 2025-11-22 08:31:26.846788913 +0000 UTC m=+0.058502489 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:31:27 np0005531887 nova_compute[186849]: 2025-11-22 08:31:27.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:27 np0005531887 nova_compute[186849]: 2025-11-22 08:31:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:28 np0005531887 nova_compute[186849]: 2025-11-22 08:31:28.352 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:31 np0005531887 podman[242219]: 2025-11-22 08:31:31.832481714 +0000 UTC m=+0.053607169 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:31:32 np0005531887 nova_compute[186849]: 2025-11-22 08:31:32.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:33 np0005531887 nova_compute[186849]: 2025-11-22 08:31:33.354 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:35 np0005531887 nova_compute[186849]: 2025-11-22 08:31:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:31:36 np0005531887 podman[242243]: 2025-11-22 08:31:36.834895258 +0000 UTC m=+0.053861616 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:37.367 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:37.367 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:31:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:37.367 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:31:37 np0005531887 nova_compute[186849]: 2025-11-22 08:31:37.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:38 np0005531887 nova_compute[186849]: 2025-11-22 08:31:38.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:42 np0005531887 nova_compute[186849]: 2025-11-22 08:31:42.558 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:42.652 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:31:42 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:42.653 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:31:42 np0005531887 nova_compute[186849]: 2025-11-22 08:31:42.653 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:42 np0005531887 podman[242268]: 2025-11-22 08:31:42.862823331 +0000 UTC m=+0.079997659 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:31:42 np0005531887 podman[242267]: 2025-11-22 08:31:42.863095037 +0000 UTC m=+0.084678894 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:31:43 np0005531887 nova_compute[186849]: 2025-11-22 08:31:43.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:46 np0005531887 podman[242309]: 2025-11-22 08:31:46.853324098 +0000 UTC m=+0.054397829 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:31:47 np0005531887 nova_compute[186849]: 2025-11-22 08:31:47.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:48 np0005531887 nova_compute[186849]: 2025-11-22 08:31:48.359 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:51 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:31:51.654 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:31:52 np0005531887 nova_compute[186849]: 2025-11-22 08:31:52.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:53 np0005531887 nova_compute[186849]: 2025-11-22 08:31:53.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:53 np0005531887 podman[242334]: 2025-11-22 08:31:53.832498934 +0000 UTC m=+0.054301047 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:31:57 np0005531887 nova_compute[186849]: 2025-11-22 08:31:57.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:31:57 np0005531887 podman[242353]: 2025-11-22 08:31:57.830486615 +0000 UTC m=+0.055383023 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd)
Nov 22 03:31:58 np0005531887 nova_compute[186849]: 2025-11-22 08:31:58.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:02 np0005531887 nova_compute[186849]: 2025-11-22 08:32:02.570 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:02 np0005531887 podman[242373]: 2025-11-22 08:32:02.859564874 +0000 UTC m=+0.069506781 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:32:03 np0005531887 nova_compute[186849]: 2025-11-22 08:32:03.364 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:07 np0005531887 nova_compute[186849]: 2025-11-22 08:32:07.573 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:07 np0005531887 podman[242397]: 2025-11-22 08:32:07.84217049 +0000 UTC m=+0.059462915 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 22 03:32:08 np0005531887 nova_compute[186849]: 2025-11-22 08:32:08.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:12 np0005531887 nova_compute[186849]: 2025-11-22 08:32:12.577 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:13 np0005531887 nova_compute[186849]: 2025-11-22 08:32:13.367 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:13 np0005531887 podman[242418]: 2025-11-22 08:32:13.838302309 +0000 UTC m=+0.059299089 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 03:32:13 np0005531887 podman[242419]: 2025-11-22 08:32:13.867703193 +0000 UTC m=+0.084288815 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:32:14 np0005531887 nova_compute[186849]: 2025-11-22 08:32:14.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:16 np0005531887 nova_compute[186849]: 2025-11-22 08:32:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:16 np0005531887 nova_compute[186849]: 2025-11-22 08:32:16.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:32:16 np0005531887 nova_compute[186849]: 2025-11-22 08:32:16.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:32:16 np0005531887 nova_compute[186849]: 2025-11-22 08:32:16.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:32:17 np0005531887 nova_compute[186849]: 2025-11-22 08:32:17.578 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:17 np0005531887 podman[242460]: 2025-11-22 08:32:17.82166788 +0000 UTC m=+0.043919291 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:32:18 np0005531887 nova_compute[186849]: 2025-11-22 08:32:18.368 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:20 np0005531887 nova_compute[186849]: 2025-11-22 08:32:20.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:20 np0005531887 nova_compute[186849]: 2025-11-22 08:32:20.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:20 np0005531887 nova_compute[186849]: 2025-11-22 08:32:20.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.794 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.954 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.956 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.27420425415039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.956 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:22 np0005531887 nova_compute[186849]: 2025-11-22 08:32:22.956 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.585 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.586 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.646 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.661 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.663 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:32:23 np0005531887 nova_compute[186849]: 2025-11-22 08:32:23.663 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:24 np0005531887 podman[242484]: 2025-11-22 08:32:24.837234893 +0000 UTC m=+0.058343627 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:32:25 np0005531887 nova_compute[186849]: 2025-11-22 08:32:25.664 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:26 np0005531887 nova_compute[186849]: 2025-11-22 08:32:26.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:27 np0005531887 nova_compute[186849]: 2025-11-22 08:32:27.584 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:28 np0005531887 nova_compute[186849]: 2025-11-22 08:32:28.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:28 np0005531887 nova_compute[186849]: 2025-11-22 08:32:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:28 np0005531887 podman[242505]: 2025-11-22 08:32:28.842503604 +0000 UTC m=+0.062889298 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:32:32 np0005531887 nova_compute[186849]: 2025-11-22 08:32:32.588 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:33 np0005531887 nova_compute[186849]: 2025-11-22 08:32:33.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:33 np0005531887 podman[242523]: 2025-11-22 08:32:33.862898519 +0000 UTC m=+0.078954513 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:32:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:32:36 np0005531887 nova_compute[186849]: 2025-11-22 08:32:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:37.368 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:37.369 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:32:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:37.369 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:32:37 np0005531887 nova_compute[186849]: 2025-11-22 08:32:37.591 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:38 np0005531887 nova_compute[186849]: 2025-11-22 08:32:38.376 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:38 np0005531887 podman[242547]: 2025-11-22 08:32:38.844335126 +0000 UTC m=+0.064322794 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 22 03:32:42 np0005531887 nova_compute[186849]: 2025-11-22 08:32:42.593 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:43 np0005531887 nova_compute[186849]: 2025-11-22 08:32:43.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:44 np0005531887 podman[242568]: 2025-11-22 08:32:44.853338325 +0000 UTC m=+0.064468808 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:32:44 np0005531887 podman[242569]: 2025-11-22 08:32:44.886538631 +0000 UTC m=+0.092464676 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 22 03:32:44 np0005531887 nova_compute[186849]: 2025-11-22 08:32:44.910 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:44.910 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:32:44 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:44.911 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:32:45 np0005531887 nova_compute[186849]: 2025-11-22 08:32:45.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:32:47 np0005531887 nova_compute[186849]: 2025-11-22 08:32:47.597 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:48 np0005531887 nova_compute[186849]: 2025-11-22 08:32:48.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:48 np0005531887 podman[242613]: 2025-11-22 08:32:48.823294336 +0000 UTC m=+0.045874680 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:32:52 np0005531887 nova_compute[186849]: 2025-11-22 08:32:52.603 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:52 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:32:52.913 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:32:53 np0005531887 nova_compute[186849]: 2025-11-22 08:32:53.381 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:55 np0005531887 podman[242638]: 2025-11-22 08:32:55.863472063 +0000 UTC m=+0.081675300 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:32:57 np0005531887 nova_compute[186849]: 2025-11-22 08:32:57.606 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:58 np0005531887 nova_compute[186849]: 2025-11-22 08:32:58.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:32:59 np0005531887 podman[242658]: 2025-11-22 08:32:59.843110972 +0000 UTC m=+0.057442714 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 03:33:02 np0005531887 nova_compute[186849]: 2025-11-22 08:33:02.610 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:03 np0005531887 nova_compute[186849]: 2025-11-22 08:33:03.383 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:04 np0005531887 podman[242678]: 2025-11-22 08:33:04.831473988 +0000 UTC m=+0.052458301 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:33:07 np0005531887 nova_compute[186849]: 2025-11-22 08:33:07.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:08 np0005531887 nova_compute[186849]: 2025-11-22 08:33:08.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:09 np0005531887 podman[242702]: 2025-11-22 08:33:09.862457045 +0000 UTC m=+0.086079399 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=)
Nov 22 03:33:12 np0005531887 nova_compute[186849]: 2025-11-22 08:33:12.616 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:13 np0005531887 nova_compute[186849]: 2025-11-22 08:33:13.388 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:15 np0005531887 podman[242723]: 2025-11-22 08:33:15.835310603 +0000 UTC m=+0.055783004 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:33:15 np0005531887 podman[242724]: 2025-11-22 08:33:15.86286566 +0000 UTC m=+0.078448191 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 03:33:16 np0005531887 nova_compute[186849]: 2025-11-22 08:33:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:17 np0005531887 nova_compute[186849]: 2025-11-22 08:33:17.620 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:17 np0005531887 nova_compute[186849]: 2025-11-22 08:33:17.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:17 np0005531887 nova_compute[186849]: 2025-11-22 08:33:17.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:33:17 np0005531887 nova_compute[186849]: 2025-11-22 08:33:17.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:33:17 np0005531887 nova_compute[186849]: 2025-11-22 08:33:17.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:33:18 np0005531887 nova_compute[186849]: 2025-11-22 08:33:18.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:19 np0005531887 podman[242767]: 2025-11-22 08:33:19.831131789 +0000 UTC m=+0.055773672 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:33:20 np0005531887 nova_compute[186849]: 2025-11-22 08:33:20.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:20 np0005531887 nova_compute[186849]: 2025-11-22 08:33:20.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:33:22 np0005531887 nova_compute[186849]: 2025-11-22 08:33:22.622 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:22 np0005531887 nova_compute[186849]: 2025-11-22 08:33:22.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.392 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.794 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.795 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.966 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.967 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.27415466308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.967 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:23 np0005531887 nova_compute[186849]: 2025-11-22 08:33:23.968 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.191 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.191 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.227 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.245 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.247 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:33:24 np0005531887 nova_compute[186849]: 2025-11-22 08:33:24.247 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:26 np0005531887 nova_compute[186849]: 2025-11-22 08:33:26.248 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:26 np0005531887 nova_compute[186849]: 2025-11-22 08:33:26.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:26 np0005531887 podman[242793]: 2025-11-22 08:33:26.891541014 +0000 UTC m=+0.111812962 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:33:27 np0005531887 nova_compute[186849]: 2025-11-22 08:33:27.625 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:28 np0005531887 nova_compute[186849]: 2025-11-22 08:33:28.395 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:28 np0005531887 nova_compute[186849]: 2025-11-22 08:33:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:28 np0005531887 nova_compute[186849]: 2025-11-22 08:33:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:30 np0005531887 podman[242810]: 2025-11-22 08:33:30.860588373 +0000 UTC m=+0.080595325 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:33:32 np0005531887 nova_compute[186849]: 2025-11-22 08:33:32.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:33 np0005531887 nova_compute[186849]: 2025-11-22 08:33:33.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:35 np0005531887 podman[242831]: 2025-11-22 08:33:35.83312308 +0000 UTC m=+0.052267287 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:33:37.369 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:33:37.370 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:33:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:33:37.370 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:33:37 np0005531887 nova_compute[186849]: 2025-11-22 08:33:37.631 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:38 np0005531887 nova_compute[186849]: 2025-11-22 08:33:38.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:38 np0005531887 nova_compute[186849]: 2025-11-22 08:33:38.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:40 np0005531887 podman[242856]: 2025-11-22 08:33:40.863691347 +0000 UTC m=+0.070793842 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 22 03:33:42 np0005531887 nova_compute[186849]: 2025-11-22 08:33:42.633 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:43 np0005531887 nova_compute[186849]: 2025-11-22 08:33:43.399 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:46 np0005531887 podman[242877]: 2025-11-22 08:33:46.847165997 +0000 UTC m=+0.062799926 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:33:46 np0005531887 podman[242878]: 2025-11-22 08:33:46.875853093 +0000 UTC m=+0.091568484 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:33:47 np0005531887 nova_compute[186849]: 2025-11-22 08:33:47.635 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:48 np0005531887 nova_compute[186849]: 2025-11-22 08:33:48.401 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:50 np0005531887 podman[242920]: 2025-11-22 08:33:50.83306505 +0000 UTC m=+0.057313831 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:33:52 np0005531887 nova_compute[186849]: 2025-11-22 08:33:52.638 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:53 np0005531887 nova_compute[186849]: 2025-11-22 08:33:53.402 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:57 np0005531887 nova_compute[186849]: 2025-11-22 08:33:57.641 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:57 np0005531887 podman[242945]: 2025-11-22 08:33:57.846790946 +0000 UTC m=+0.071850549 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:33:58 np0005531887 nova_compute[186849]: 2025-11-22 08:33:58.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:58 np0005531887 nova_compute[186849]: 2025-11-22 08:33:58.673 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:33:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:33:59.988 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:33:59 np0005531887 nova_compute[186849]: 2025-11-22 08:33:59.988 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:33:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:33:59.989 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:34:01 np0005531887 podman[242964]: 2025-11-22 08:34:01.851640206 +0000 UTC m=+0.073317445 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 03:34:02 np0005531887 nova_compute[186849]: 2025-11-22 08:34:02.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:03 np0005531887 nova_compute[186849]: 2025-11-22 08:34:03.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:06 np0005531887 podman[242984]: 2025-11-22 08:34:06.855138445 +0000 UTC m=+0.060095070 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:34:06 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:06.991 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:07 np0005531887 nova_compute[186849]: 2025-11-22 08:34:07.647 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:08 np0005531887 nova_compute[186849]: 2025-11-22 08:34:08.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:11 np0005531887 podman[243009]: 2025-11-22 08:34:11.835221508 +0000 UTC m=+0.054898211 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Nov 22 03:34:12 np0005531887 nova_compute[186849]: 2025-11-22 08:34:12.650 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:13 np0005531887 nova_compute[186849]: 2025-11-22 08:34:13.408 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:17 np0005531887 nova_compute[186849]: 2025-11-22 08:34:17.653 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:17 np0005531887 nova_compute[186849]: 2025-11-22 08:34:17.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:17 np0005531887 podman[243032]: 2025-11-22 08:34:17.850931931 +0000 UTC m=+0.071755043 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 22 03:34:17 np0005531887 podman[243033]: 2025-11-22 08:34:17.900566799 +0000 UTC m=+0.115646420 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:34:18 np0005531887 nova_compute[186849]: 2025-11-22 08:34:18.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:18 np0005531887 nova_compute[186849]: 2025-11-22 08:34:18.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:18 np0005531887 nova_compute[186849]: 2025-11-22 08:34:18.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:34:18 np0005531887 nova_compute[186849]: 2025-11-22 08:34:18.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:34:18 np0005531887 nova_compute[186849]: 2025-11-22 08:34:18.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:34:21 np0005531887 nova_compute[186849]: 2025-11-22 08:34:21.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:21 np0005531887 nova_compute[186849]: 2025-11-22 08:34:21.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:34:21 np0005531887 podman[243078]: 2025-11-22 08:34:21.831231104 +0000 UTC m=+0.051600608 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:34:22 np0005531887 nova_compute[186849]: 2025-11-22 08:34:22.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:23 np0005531887 nova_compute[186849]: 2025-11-22 08:34:23.412 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:23 np0005531887 nova_compute[186849]: 2025-11-22 08:34:23.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.789 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.949 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.950 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5755MB free_disk=73.2741928100586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.951 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:25 np0005531887 nova_compute[186849]: 2025-11-22 08:34:25.951 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.008 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.008 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.030 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.048 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.049 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:34:26 np0005531887 nova_compute[186849]: 2025-11-22 08:34:26.050 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:27 np0005531887 nova_compute[186849]: 2025-11-22 08:34:27.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:28 np0005531887 nova_compute[186849]: 2025-11-22 08:34:28.050 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:28 np0005531887 nova_compute[186849]: 2025-11-22 08:34:28.414 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:28 np0005531887 podman[243102]: 2025-11-22 08:34:28.839191835 +0000 UTC m=+0.060367033 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:34:29 np0005531887 nova_compute[186849]: 2025-11-22 08:34:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:30 np0005531887 nova_compute[186849]: 2025-11-22 08:34:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:32 np0005531887 nova_compute[186849]: 2025-11-22 08:34:32.662 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:32 np0005531887 podman[243121]: 2025-11-22 08:34:32.85412743 +0000 UTC m=+0.063209623 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 03:34:33 np0005531887 nova_compute[186849]: 2025-11-22 08:34:33.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:34 np0005531887 nova_compute[186849]: 2025-11-22 08:34:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:34 np0005531887 nova_compute[186849]: 2025-11-22 08:34:34.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:34:34 np0005531887 nova_compute[186849]: 2025-11-22 08:34:34.817 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:34:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:37.370 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:37.370 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:34:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:37.371 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:34:37 np0005531887 nova_compute[186849]: 2025-11-22 08:34:37.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:37 np0005531887 podman[243140]: 2025-11-22 08:34:37.848282123 +0000 UTC m=+0.053051804 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:34:38 np0005531887 nova_compute[186849]: 2025-11-22 08:34:38.419 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:40 np0005531887 nova_compute[186849]: 2025-11-22 08:34:40.817 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:42 np0005531887 nova_compute[186849]: 2025-11-22 08:34:42.666 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:42 np0005531887 podman[243165]: 2025-11-22 08:34:42.840344587 +0000 UTC m=+0.060886747 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, config_id=edpm, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 03:34:43 np0005531887 nova_compute[186849]: 2025-11-22 08:34:43.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531887 nova_compute[186849]: 2025-11-22 08:34:47.288 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:47.291 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:34:47 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:47.292 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:34:47 np0005531887 nova_compute[186849]: 2025-11-22 08:34:47.669 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:47 np0005531887 nova_compute[186849]: 2025-11-22 08:34:47.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:48 np0005531887 nova_compute[186849]: 2025-11-22 08:34:48.424 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:48 np0005531887 nova_compute[186849]: 2025-11-22 08:34:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:48 np0005531887 nova_compute[186849]: 2025-11-22 08:34:48.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:34:48 np0005531887 podman[243187]: 2025-11-22 08:34:48.86262296 +0000 UTC m=+0.071285521 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:34:48 np0005531887 podman[243188]: 2025-11-22 08:34:48.924060329 +0000 UTC m=+0.127112882 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:34:52 np0005531887 nova_compute[186849]: 2025-11-22 08:34:52.673 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:52 np0005531887 podman[243233]: 2025-11-22 08:34:52.872179833 +0000 UTC m=+0.086621048 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:34:53 np0005531887 nova_compute[186849]: 2025-11-22 08:34:53.426 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:34:55.295 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:34:55 np0005531887 nova_compute[186849]: 2025-11-22 08:34:55.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:34:57 np0005531887 nova_compute[186849]: 2025-11-22 08:34:57.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:58 np0005531887 nova_compute[186849]: 2025-11-22 08:34:58.427 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:34:59 np0005531887 podman[243259]: 2025-11-22 08:34:59.835015927 +0000 UTC m=+0.050160303 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:35:02 np0005531887 nova_compute[186849]: 2025-11-22 08:35:02.678 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:03 np0005531887 nova_compute[186849]: 2025-11-22 08:35:03.428 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:03 np0005531887 podman[243278]: 2025-11-22 08:35:03.840053337 +0000 UTC m=+0.057102833 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:35:07 np0005531887 nova_compute[186849]: 2025-11-22 08:35:07.681 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:08 np0005531887 nova_compute[186849]: 2025-11-22 08:35:08.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:08 np0005531887 podman[243301]: 2025-11-22 08:35:08.614825655 +0000 UTC m=+0.060820884 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:35:10 np0005531887 podman[201064]: time="2025-11-22T08:35:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 03:35:10 np0005531887 podman[201064]: @ - - [22/Nov/2025:08:35:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22840 "" "Go-http-client/1.1"
Nov 22 03:35:12 np0005531887 nova_compute[186849]: 2025-11-22 08:35:12.684 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:13 np0005531887 nova_compute[186849]: 2025-11-22 08:35:13.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:13 np0005531887 nova_compute[186849]: 2025-11-22 08:35:13.670 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:13 np0005531887 podman[243325]: 2025-11-22 08:35:13.844571652 +0000 UTC m=+0.059055871 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=edpm)
Nov 22 03:35:17 np0005531887 nova_compute[186849]: 2025-11-22 08:35:17.687 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:17 np0005531887 nova_compute[186849]: 2025-11-22 08:35:17.791 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:18 np0005531887 nova_compute[186849]: 2025-11-22 08:35:18.432 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:18 np0005531887 nova_compute[186849]: 2025-11-22 08:35:18.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:18 np0005531887 nova_compute[186849]: 2025-11-22 08:35:18.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:35:18 np0005531887 nova_compute[186849]: 2025-11-22 08:35:18.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:35:18 np0005531887 nova_compute[186849]: 2025-11-22 08:35:18.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:35:19 np0005531887 podman[243347]: 2025-11-22 08:35:19.833849875 +0000 UTC m=+0.058456446 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:35:19 np0005531887 podman[243348]: 2025-11-22 08:35:19.852957275 +0000 UTC m=+0.075475064 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 03:35:21 np0005531887 nova_compute[186849]: 2025-11-22 08:35:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:21 np0005531887 nova_compute[186849]: 2025-11-22 08:35:21.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:35:22 np0005531887 nova_compute[186849]: 2025-11-22 08:35:22.690 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:23 np0005531887 nova_compute[186849]: 2025-11-22 08:35:23.434 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:23 np0005531887 podman[243395]: 2025-11-22 08:35:23.832322255 +0000 UTC m=+0.055353030 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:35:25 np0005531887 nova_compute[186849]: 2025-11-22 08:35:25.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.972 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.974 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5740MB free_disk=73.27418899536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.974 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:26 np0005531887 nova_compute[186849]: 2025-11-22 08:35:26.974 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.027 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.027 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.052 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.069 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.071 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.071 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:27 np0005531887 nova_compute[186849]: 2025-11-22 08:35:27.693 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:28 np0005531887 nova_compute[186849]: 2025-11-22 08:35:28.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:30 np0005531887 nova_compute[186849]: 2025-11-22 08:35:30.071 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:30 np0005531887 nova_compute[186849]: 2025-11-22 08:35:30.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:30 np0005531887 podman[243419]: 2025-11-22 08:35:30.852395934 +0000 UTC m=+0.074874930 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:35:32 np0005531887 nova_compute[186849]: 2025-11-22 08:35:32.695 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:32 np0005531887 nova_compute[186849]: 2025-11-22 08:35:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:33 np0005531887 nova_compute[186849]: 2025-11-22 08:35:33.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:34 np0005531887 podman[243438]: 2025-11-22 08:35:34.837425764 +0000 UTC m=+0.057658576 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:35:37.371 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:35:37.372 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:35:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:35:37.372 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:35:37 np0005531887 nova_compute[186849]: 2025-11-22 08:35:37.699 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:38 np0005531887 nova_compute[186849]: 2025-11-22 08:35:38.440 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:38 np0005531887 podman[243458]: 2025-11-22 08:35:38.855400773 +0000 UTC m=+0.066084273 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:35:41 np0005531887 nova_compute[186849]: 2025-11-22 08:35:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:35:42 np0005531887 nova_compute[186849]: 2025-11-22 08:35:42.701 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:43 np0005531887 nova_compute[186849]: 2025-11-22 08:35:43.441 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:44 np0005531887 podman[243483]: 2025-11-22 08:35:44.844157845 +0000 UTC m=+0.062180428 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7)
Nov 22 03:35:47 np0005531887 nova_compute[186849]: 2025-11-22 08:35:47.703 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:48 np0005531887 nova_compute[186849]: 2025-11-22 08:35:48.443 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:50 np0005531887 podman[243504]: 2025-11-22 08:35:50.836509154 +0000 UTC m=+0.057903943 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:35:51 np0005531887 podman[243505]: 2025-11-22 08:35:51.039160927 +0000 UTC m=+0.252928519 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:35:52 np0005531887 nova_compute[186849]: 2025-11-22 08:35:52.706 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:53 np0005531887 nova_compute[186849]: 2025-11-22 08:35:53.444 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:54 np0005531887 podman[243553]: 2025-11-22 08:35:54.833759533 +0000 UTC m=+0.053434373 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:35:57 np0005531887 nova_compute[186849]: 2025-11-22 08:35:57.708 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:35:58 np0005531887 nova_compute[186849]: 2025-11-22 08:35:58.448 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:01 np0005531887 podman[243579]: 2025-11-22 08:36:01.82808708 +0000 UTC m=+0.049512888 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:36:02 np0005531887 nova_compute[186849]: 2025-11-22 08:36:02.710 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:03 np0005531887 nova_compute[186849]: 2025-11-22 08:36:03.451 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:05 np0005531887 podman[243598]: 2025-11-22 08:36:05.827850562 +0000 UTC m=+0.052898040 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:36:07 np0005531887 nova_compute[186849]: 2025-11-22 08:36:07.713 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:08 np0005531887 nova_compute[186849]: 2025-11-22 08:36:08.453 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:09 np0005531887 podman[243620]: 2025-11-22 08:36:09.854130404 +0000 UTC m=+0.076407946 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:36:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:12.395 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:36:12 np0005531887 nova_compute[186849]: 2025-11-22 08:36:12.396 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:12.397 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:36:12 np0005531887 nova_compute[186849]: 2025-11-22 08:36:12.715 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:13 np0005531887 nova_compute[186849]: 2025-11-22 08:36:13.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:15 np0005531887 podman[243645]: 2025-11-22 08:36:15.82887435 +0000 UTC m=+0.049753932 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 22 03:36:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:16.399 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:36:17 np0005531887 nova_compute[186849]: 2025-11-22 08:36:17.719 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:17 np0005531887 nova_compute[186849]: 2025-11-22 08:36:17.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:18 np0005531887 nova_compute[186849]: 2025-11-22 08:36:18.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:20 np0005531887 nova_compute[186849]: 2025-11-22 08:36:20.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:20 np0005531887 nova_compute[186849]: 2025-11-22 08:36:20.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:36:20 np0005531887 nova_compute[186849]: 2025-11-22 08:36:20.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:36:20 np0005531887 nova_compute[186849]: 2025-11-22 08:36:20.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:36:21 np0005531887 podman[243667]: 2025-11-22 08:36:21.839341575 +0000 UTC m=+0.059825201 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:36:21 np0005531887 podman[243668]: 2025-11-22 08:36:21.858229499 +0000 UTC m=+0.077883974 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 03:36:22 np0005531887 nova_compute[186849]: 2025-11-22 08:36:22.721 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:23 np0005531887 nova_compute[186849]: 2025-11-22 08:36:23.457 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:23 np0005531887 nova_compute[186849]: 2025-11-22 08:36:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:23 np0005531887 nova_compute[186849]: 2025-11-22 08:36:23.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:36:25 np0005531887 podman[243712]: 2025-11-22 08:36:25.852100656 +0000 UTC m=+0.073006473 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:36:27 np0005531887 nova_compute[186849]: 2025-11-22 08:36:27.723 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:27 np0005531887 nova_compute[186849]: 2025-11-22 08:36:27.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.459 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.797 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.964 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.966 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5751MB free_disk=73.27418899536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.966 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:28 np0005531887 nova_compute[186849]: 2025-11-22 08:36:28.967 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.014 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.014 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.033 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.049 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.049 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.061 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.089 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.108 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.124 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.125 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:36:29 np0005531887 nova_compute[186849]: 2025-11-22 08:36:29.126 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:31 np0005531887 nova_compute[186849]: 2025-11-22 08:36:31.126 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:31 np0005531887 nova_compute[186849]: 2025-11-22 08:36:31.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:32 np0005531887 nova_compute[186849]: 2025-11-22 08:36:32.726 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:32 np0005531887 nova_compute[186849]: 2025-11-22 08:36:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:32 np0005531887 podman[243736]: 2025-11-22 08:36:32.831109467 +0000 UTC m=+0.052176613 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:36:33 np0005531887 nova_compute[186849]: 2025-11-22 08:36:33.462 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:36:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:36:36 np0005531887 podman[243756]: 2025-11-22 08:36:36.86599306 +0000 UTC m=+0.088792890 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:37.372 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:37.373 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:36:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:36:37.373 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:36:37 np0005531887 nova_compute[186849]: 2025-11-22 08:36:37.728 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:38 np0005531887 nova_compute[186849]: 2025-11-22 08:36:38.463 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:40 np0005531887 podman[243776]: 2025-11-22 08:36:40.836113535 +0000 UTC m=+0.051247470 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:36:41 np0005531887 nova_compute[186849]: 2025-11-22 08:36:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:42 np0005531887 nova_compute[186849]: 2025-11-22 08:36:42.731 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:43 np0005531887 nova_compute[186849]: 2025-11-22 08:36:43.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:46 np0005531887 podman[243800]: 2025-11-22 08:36:46.853889517 +0000 UTC m=+0.066524314 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:36:47 np0005531887 nova_compute[186849]: 2025-11-22 08:36:47.733 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:48 np0005531887 nova_compute[186849]: 2025-11-22 08:36:48.467 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:49 np0005531887 nova_compute[186849]: 2025-11-22 08:36:49.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:36:52 np0005531887 nova_compute[186849]: 2025-11-22 08:36:52.736 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:52 np0005531887 podman[243823]: 2025-11-22 08:36:52.843320375 +0000 UTC m=+0.062017805 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:36:52 np0005531887 podman[243824]: 2025-11-22 08:36:52.866123284 +0000 UTC m=+0.081526312 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 03:36:53 np0005531887 nova_compute[186849]: 2025-11-22 08:36:53.468 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:56 np0005531887 podman[243870]: 2025-11-22 08:36:56.833414828 +0000 UTC m=+0.055590296 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:36:57 np0005531887 nova_compute[186849]: 2025-11-22 08:36:57.739 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:36:58 np0005531887 nova_compute[186849]: 2025-11-22 08:36:58.470 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:02 np0005531887 nova_compute[186849]: 2025-11-22 08:37:02.742 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:03 np0005531887 nova_compute[186849]: 2025-11-22 08:37:03.472 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:03 np0005531887 podman[243895]: 2025-11-22 08:37:03.836372867 +0000 UTC m=+0.055472074 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:37:07 np0005531887 nova_compute[186849]: 2025-11-22 08:37:07.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:07 np0005531887 podman[243914]: 2025-11-22 08:37:07.841531731 +0000 UTC m=+0.065072309 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 03:37:08 np0005531887 nova_compute[186849]: 2025-11-22 08:37:08.475 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:11 np0005531887 podman[243934]: 2025-11-22 08:37:11.853593284 +0000 UTC m=+0.073199877 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:37:12 np0005531887 nova_compute[186849]: 2025-11-22 08:37:12.748 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:13 np0005531887 nova_compute[186849]: 2025-11-22 08:37:13.477 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:17.078 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:37:17 np0005531887 nova_compute[186849]: 2025-11-22 08:37:17.078 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:17.079 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:37:17 np0005531887 nova_compute[186849]: 2025-11-22 08:37:17.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:17 np0005531887 nova_compute[186849]: 2025-11-22 08:37:17.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:17 np0005531887 podman[243958]: 2025-11-22 08:37:17.869844689 +0000 UTC m=+0.089039748 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm)
Nov 22 03:37:18 np0005531887 nova_compute[186849]: 2025-11-22 08:37:18.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:21.081 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:37:21 np0005531887 nova_compute[186849]: 2025-11-22 08:37:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:21 np0005531887 nova_compute[186849]: 2025-11-22 08:37:21.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:37:21 np0005531887 nova_compute[186849]: 2025-11-22 08:37:21.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:37:21 np0005531887 nova_compute[186849]: 2025-11-22 08:37:21.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:37:22 np0005531887 nova_compute[186849]: 2025-11-22 08:37:22.754 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:23 np0005531887 nova_compute[186849]: 2025-11-22 08:37:23.480 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:23 np0005531887 podman[243979]: 2025-11-22 08:37:23.85252292 +0000 UTC m=+0.068221556 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm)
Nov 22 03:37:23 np0005531887 podman[243980]: 2025-11-22 08:37:23.895980167 +0000 UTC m=+0.106807703 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:37:25 np0005531887 nova_compute[186849]: 2025-11-22 08:37:25.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:25 np0005531887 nova_compute[186849]: 2025-11-22 08:37:25.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:37:27 np0005531887 nova_compute[186849]: 2025-11-22 08:37:27.757 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:27 np0005531887 podman[244022]: 2025-11-22 08:37:27.876510207 +0000 UTC m=+0.081417900 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:37:28 np0005531887 nova_compute[186849]: 2025-11-22 08:37:28.485 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:37:28 np0005531887 nova_compute[186849]: 2025-11-22 08:37:28.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.811 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.811 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.811 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.812 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.967 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.968 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5760MB free_disk=73.27419662475586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.968 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:37:29 np0005531887 nova_compute[186849]: 2025-11-22 08:37:29.969 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.119 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.119 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.195 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.220 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.222 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 03:37:30 np0005531887 nova_compute[186849]: 2025-11-22 08:37:30.222 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:37:31 np0005531887 nova_compute[186849]: 2025-11-22 08:37:31.222 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:37:32 np0005531887 nova_compute[186849]: 2025-11-22 08:37:32.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:33 np0005531887 nova_compute[186849]: 2025-11-22 08:37:33.485 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:33 np0005531887 nova_compute[186849]: 2025-11-22 08:37:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:37:33 np0005531887 nova_compute[186849]: 2025-11-22 08:37:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:37:34 np0005531887 podman[244046]: 2025-11-22 08:37:34.847607143 +0000 UTC m=+0.063204493 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:37.374 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:37.375 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:37:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:37:37.375 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:37:37 np0005531887 nova_compute[186849]: 2025-11-22 08:37:37.762 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:38 np0005531887 nova_compute[186849]: 2025-11-22 08:37:38.486 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:38 np0005531887 podman[244065]: 2025-11-22 08:37:38.860245301 +0000 UTC m=+0.076100561 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 03:37:42 np0005531887 nova_compute[186849]: 2025-11-22 08:37:42.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:42 np0005531887 nova_compute[186849]: 2025-11-22 08:37:42.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:37:42 np0005531887 podman[244087]: 2025-11-22 08:37:42.861878199 +0000 UTC m=+0.063364658 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:37:43 np0005531887 nova_compute[186849]: 2025-11-22 08:37:43.488 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:47 np0005531887 nova_compute[186849]: 2025-11-22 08:37:47.771 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:48 np0005531887 nova_compute[186849]: 2025-11-22 08:37:48.491 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:48 np0005531887 podman[244112]: 2025-11-22 08:37:48.855216052 +0000 UTC m=+0.066415482 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, config_id=edpm)
Nov 22 03:37:52 np0005531887 nova_compute[186849]: 2025-11-22 08:37:52.774 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:53 np0005531887 nova_compute[186849]: 2025-11-22 08:37:53.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:54 np0005531887 podman[244135]: 2025-11-22 08:37:54.880690905 +0000 UTC m=+0.093111918 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:37:54 np0005531887 podman[244134]: 2025-11-22 08:37:54.902243273 +0000 UTC m=+0.110076564 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 22 03:37:57 np0005531887 nova_compute[186849]: 2025-11-22 08:37:57.778 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:58 np0005531887 nova_compute[186849]: 2025-11-22 08:37:58.494 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:37:58 np0005531887 podman[244179]: 2025-11-22 08:37:58.842648067 +0000 UTC m=+0.063158261 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:38:02 np0005531887 nova_compute[186849]: 2025-11-22 08:38:02.781 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:03 np0005531887 nova_compute[186849]: 2025-11-22 08:38:03.496 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:05 np0005531887 podman[244203]: 2025-11-22 08:38:05.853363566 +0000 UTC m=+0.072916391 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 03:38:07 np0005531887 nova_compute[186849]: 2025-11-22 08:38:07.785 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:08 np0005531887 nova_compute[186849]: 2025-11-22 08:38:08.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:09 np0005531887 podman[244222]: 2025-11-22 08:38:09.846434745 +0000 UTC m=+0.066328949 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:38:12 np0005531887 nova_compute[186849]: 2025-11-22 08:38:12.789 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:13 np0005531887 nova_compute[186849]: 2025-11-22 08:38:13.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:13 np0005531887 podman[244242]: 2025-11-22 08:38:13.842040025 +0000 UTC m=+0.063871139 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:38:17 np0005531887 nova_compute[186849]: 2025-11-22 08:38:17.791 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:18 np0005531887 nova_compute[186849]: 2025-11-22 08:38:18.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:19.333 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:38:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:19.335 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:38:19 np0005531887 nova_compute[186849]: 2025-11-22 08:38:19.335 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:38:19 np0005531887 nova_compute[186849]: 2025-11-22 08:38:19.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:38:19 np0005531887 podman[244266]: 2025-11-22 08:38:19.85538369 +0000 UTC m=+0.065721315 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 22 03:38:21 np0005531887 nova_compute[186849]: 2025-11-22 08:38:21.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:21 np0005531887 nova_compute[186849]: 2025-11-22 08:38:21.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:38:21 np0005531887 nova_compute[186849]: 2025-11-22 08:38:21.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:38:21 np0005531887 nova_compute[186849]: 2025-11-22 08:38:21.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:38:22 np0005531887 nova_compute[186849]: 2025-11-22 08:38:22.795 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:23 np0005531887 nova_compute[186849]: 2025-11-22 08:38:23.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:25 np0005531887 podman[244287]: 2025-11-22 08:38:25.839617809 +0000 UTC m=+0.062985288 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 22 03:38:25 np0005531887 podman[244288]: 2025-11-22 08:38:25.872211049 +0000 UTC m=+0.090054042 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:38:27 np0005531887 nova_compute[186849]: 2025-11-22 08:38:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:27 np0005531887 nova_compute[186849]: 2025-11-22 08:38:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:38:27 np0005531887 nova_compute[186849]: 2025-11-22 08:38:27.798 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:28 np0005531887 nova_compute[186849]: 2025-11-22 08:38:28.506 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:29.337 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:38:29 np0005531887 nova_compute[186849]: 2025-11-22 08:38:29.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:29 np0005531887 podman[244330]: 2025-11-22 08:38:29.845445097 +0000 UTC m=+0.062745442 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.947 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.948 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5737MB free_disk=73.27417755126953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.948 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.993 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:38:30 np0005531887 nova_compute[186849]: 2025-11-22 08:38:30.993 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:38:31 np0005531887 nova_compute[186849]: 2025-11-22 08:38:31.012 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:38:31 np0005531887 nova_compute[186849]: 2025-11-22 08:38:31.026 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:38:31 np0005531887 nova_compute[186849]: 2025-11-22 08:38:31.027 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:38:31 np0005531887 nova_compute[186849]: 2025-11-22 08:38:31.028 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:32 np0005531887 nova_compute[186849]: 2025-11-22 08:38:32.028 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:32 np0005531887 nova_compute[186849]: 2025-11-22 08:38:32.801 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:33 np0005531887 nova_compute[186849]: 2025-11-22 08:38:33.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:34 np0005531887 nova_compute[186849]: 2025-11-22 08:38:34.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:35 np0005531887 nova_compute[186849]: 2025-11-22 08:38:35.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:38:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:38:36 np0005531887 podman[244356]: 2025-11-22 08:38:36.832246261 +0000 UTC m=+0.055268028 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:37.375 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:37.376 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:38:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:38:37.377 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:38:37 np0005531887 nova_compute[186849]: 2025-11-22 08:38:37.805 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:38 np0005531887 nova_compute[186849]: 2025-11-22 08:38:38.508 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:40 np0005531887 podman[244377]: 2025-11-22 08:38:40.871749869 +0000 UTC m=+0.079550434 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:38:42 np0005531887 nova_compute[186849]: 2025-11-22 08:38:42.809 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:43 np0005531887 nova_compute[186849]: 2025-11-22 08:38:43.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:44 np0005531887 nova_compute[186849]: 2025-11-22 08:38:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:44 np0005531887 podman[244397]: 2025-11-22 08:38:44.873042538 +0000 UTC m=+0.088286539 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:38:48 np0005531887 nova_compute[186849]: 2025-11-22 08:38:47.818 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:48 np0005531887 nova_compute[186849]: 2025-11-22 08:38:48.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:50 np0005531887 podman[244421]: 2025-11-22 08:38:50.896570812 +0000 UTC m=+0.107948201 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:38:52 np0005531887 nova_compute[186849]: 2025-11-22 08:38:52.821 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:53 np0005531887 nova_compute[186849]: 2025-11-22 08:38:53.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:53 np0005531887 nova_compute[186849]: 2025-11-22 08:38:53.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:38:56 np0005531887 podman[244442]: 2025-11-22 08:38:56.846318335 +0000 UTC m=+0.064347361 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:38:56 np0005531887 podman[244443]: 2025-11-22 08:38:56.876450366 +0000 UTC m=+0.090672258 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:38:57 np0005531887 nova_compute[186849]: 2025-11-22 08:38:57.824 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:38:58 np0005531887 nova_compute[186849]: 2025-11-22 08:38:58.518 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:00 np0005531887 podman[244486]: 2025-11-22 08:39:00.86043594 +0000 UTC m=+0.082915327 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:39:02 np0005531887 nova_compute[186849]: 2025-11-22 08:39:02.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:03 np0005531887 nova_compute[186849]: 2025-11-22 08:39:03.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:07 np0005531887 nova_compute[186849]: 2025-11-22 08:39:07.830 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:07 np0005531887 podman[244511]: 2025-11-22 08:39:07.838474026 +0000 UTC m=+0.055628927 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:39:08 np0005531887 nova_compute[186849]: 2025-11-22 08:39:08.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:11 np0005531887 podman[244530]: 2025-11-22 08:39:11.84614347 +0000 UTC m=+0.067612751 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:39:12 np0005531887 nova_compute[186849]: 2025-11-22 08:39:12.833 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:13 np0005531887 nova_compute[186849]: 2025-11-22 08:39:13.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:14.272 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:39:14 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:14.273 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:39:14 np0005531887 nova_compute[186849]: 2025-11-22 08:39:14.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:15 np0005531887 podman[244551]: 2025-11-22 08:39:15.835329594 +0000 UTC m=+0.051261849 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:39:17 np0005531887 nova_compute[186849]: 2025-11-22 08:39:17.837 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:18 np0005531887 nova_compute[186849]: 2025-11-22 08:39:18.525 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:19 np0005531887 nova_compute[186849]: 2025-11-22 08:39:19.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:21 np0005531887 podman[244576]: 2025-11-22 08:39:21.835384842 +0000 UTC m=+0.053472613 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 22 03:39:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:22.276 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:39:22 np0005531887 nova_compute[186849]: 2025-11-22 08:39:22.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:22 np0005531887 nova_compute[186849]: 2025-11-22 08:39:22.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:39:22 np0005531887 nova_compute[186849]: 2025-11-22 08:39:22.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:39:22 np0005531887 nova_compute[186849]: 2025-11-22 08:39:22.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:39:22 np0005531887 nova_compute[186849]: 2025-11-22 08:39:22.839 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:23 np0005531887 nova_compute[186849]: 2025-11-22 08:39:23.527 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:27 np0005531887 nova_compute[186849]: 2025-11-22 08:39:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:27 np0005531887 nova_compute[186849]: 2025-11-22 08:39:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:39:27 np0005531887 nova_compute[186849]: 2025-11-22 08:39:27.844 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:27 np0005531887 podman[244597]: 2025-11-22 08:39:27.855358098 +0000 UTC m=+0.074044778 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 22 03:39:27 np0005531887 podman[244598]: 2025-11-22 08:39:27.925603504 +0000 UTC m=+0.140424359 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 03:39:28 np0005531887 nova_compute[186849]: 2025-11-22 08:39:28.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:29 np0005531887 nova_compute[186849]: 2025-11-22 08:39:29.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.789 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.976 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.977 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=73.27408599853516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.977 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:30 np0005531887 nova_compute[186849]: 2025-11-22 08:39:30.977 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.149 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.150 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.174 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.207 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.208 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:39:31 np0005531887 nova_compute[186849]: 2025-11-22 08:39:31.208 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:31 np0005531887 podman[244643]: 2025-11-22 08:39:31.826096537 +0000 UTC m=+0.049513566 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:39:32 np0005531887 nova_compute[186849]: 2025-11-22 08:39:32.208 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:32 np0005531887 nova_compute[186849]: 2025-11-22 08:39:32.847 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:33 np0005531887 nova_compute[186849]: 2025-11-22 08:39:33.529 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:35 np0005531887 nova_compute[186849]: 2025-11-22 08:39:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:36 np0005531887 nova_compute[186849]: 2025-11-22 08:39:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:37.376 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:37.377 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:39:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:37.377 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:39:37 np0005531887 nova_compute[186849]: 2025-11-22 08:39:37.850 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:38 np0005531887 nova_compute[186849]: 2025-11-22 08:39:38.530 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:38 np0005531887 podman[244667]: 2025-11-22 08:39:38.826012721 +0000 UTC m=+0.048611254 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 03:39:42 np0005531887 nova_compute[186849]: 2025-11-22 08:39:42.853 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:42 np0005531887 podman[244686]: 2025-11-22 08:39:42.854207022 +0000 UTC m=+0.074145242 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:39:43 np0005531887 nova_compute[186849]: 2025-11-22 08:39:43.532 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:45 np0005531887 nova_compute[186849]: 2025-11-22 08:39:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:46 np0005531887 podman[244706]: 2025-11-22 08:39:46.999289711 +0000 UTC m=+0.054716575 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:39:47 np0005531887 nova_compute[186849]: 2025-11-22 08:39:47.857 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:48 np0005531887 nova_compute[186849]: 2025-11-22 08:39:48.534 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:48 np0005531887 nova_compute[186849]: 2025-11-22 08:39:48.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:48 np0005531887 nova_compute[186849]: 2025-11-22 08:39:48.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:39:48 np0005531887 nova_compute[186849]: 2025-11-22 08:39:48.787 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:39:49 np0005531887 nova_compute[186849]: 2025-11-22 08:39:49.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:39:49 np0005531887 nova_compute[186849]: 2025-11-22 08:39:49.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:39:52 np0005531887 nova_compute[186849]: 2025-11-22 08:39:52.860 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:52 np0005531887 podman[244732]: 2025-11-22 08:39:52.868844056 +0000 UTC m=+0.086929036 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:39:53 np0005531887 nova_compute[186849]: 2025-11-22 08:39:53.168 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:53.169 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:39:53 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:39:53.170 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:39:53 np0005531887 nova_compute[186849]: 2025-11-22 08:39:53.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:57 np0005531887 nova_compute[186849]: 2025-11-22 08:39:57.862 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:58 np0005531887 nova_compute[186849]: 2025-11-22 08:39:58.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:39:58 np0005531887 podman[244753]: 2025-11-22 08:39:58.845298475 +0000 UTC m=+0.060482166 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, managed_by=edpm_ansible)
Nov 22 03:39:58 np0005531887 podman[244754]: 2025-11-22 08:39:58.904804445 +0000 UTC m=+0.113104437 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:39:59 np0005531887 nova_compute[186849]: 2025-11-22 08:39:59.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:02 np0005531887 podman[244797]: 2025-11-22 08:40:02.854381545 +0000 UTC m=+0.059733038 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:40:02 np0005531887 nova_compute[186849]: 2025-11-22 08:40:02.866 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:03.172 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:03 np0005531887 nova_compute[186849]: 2025-11-22 08:40:03.539 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:07 np0005531887 nova_compute[186849]: 2025-11-22 08:40:07.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:08 np0005531887 nova_compute[186849]: 2025-11-22 08:40:08.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:09 np0005531887 podman[244821]: 2025-11-22 08:40:09.850176238 +0000 UTC m=+0.063495560 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:40:12 np0005531887 nova_compute[186849]: 2025-11-22 08:40:12.871 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:13 np0005531887 nova_compute[186849]: 2025-11-22 08:40:13.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:13 np0005531887 podman[244840]: 2025-11-22 08:40:13.838281063 +0000 UTC m=+0.061522172 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 03:40:17 np0005531887 podman[244860]: 2025-11-22 08:40:17.83824174 +0000 UTC m=+0.054198292 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:40:17 np0005531887 nova_compute[186849]: 2025-11-22 08:40:17.874 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:18 np0005531887 nova_compute[186849]: 2025-11-22 08:40:18.545 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:21 np0005531887 nova_compute[186849]: 2025-11-22 08:40:21.796 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:22 np0005531887 nova_compute[186849]: 2025-11-22 08:40:22.877 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:23 np0005531887 nova_compute[186849]: 2025-11-22 08:40:23.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:23 np0005531887 nova_compute[186849]: 2025-11-22 08:40:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:23 np0005531887 nova_compute[186849]: 2025-11-22 08:40:23.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:40:23 np0005531887 nova_compute[186849]: 2025-11-22 08:40:23.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:40:23 np0005531887 nova_compute[186849]: 2025-11-22 08:40:23.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:40:23 np0005531887 podman[244888]: 2025-11-22 08:40:23.836280438 +0000 UTC m=+0.058944309 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, release=1755695350)
Nov 22 03:40:27 np0005531887 nova_compute[186849]: 2025-11-22 08:40:27.881 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:28 np0005531887 nova_compute[186849]: 2025-11-22 08:40:28.550 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:29 np0005531887 nova_compute[186849]: 2025-11-22 08:40:29.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:29 np0005531887 nova_compute[186849]: 2025-11-22 08:40:29.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:40:29 np0005531887 podman[244911]: 2025-11-22 08:40:29.838899668 +0000 UTC m=+0.058184180 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 03:40:29 np0005531887 podman[244912]: 2025-11-22 08:40:29.876593134 +0000 UTC m=+0.090233608 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:40:31 np0005531887 nova_compute[186849]: 2025-11-22 08:40:31.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.802 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.804 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:40:32 np0005531887 nova_compute[186849]: 2025-11-22 08:40:32.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.004 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.005 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5755MB free_disk=73.27408599853516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.005 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.005 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.096 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.096 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.124 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.144 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.146 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.146 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:33 np0005531887 nova_compute[186849]: 2025-11-22 08:40:33.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:33 np0005531887 podman[244957]: 2025-11-22 08:40:33.841454301 +0000 UTC m=+0.058210410 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:40:34 np0005531887 nova_compute[186849]: 2025-11-22 08:40:34.147 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:40:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:37.377 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:37.378 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:40:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:37.378 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:40:37 np0005531887 nova_compute[186849]: 2025-11-22 08:40:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:37 np0005531887 nova_compute[186849]: 2025-11-22 08:40:37.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:38 np0005531887 nova_compute[186849]: 2025-11-22 08:40:38.552 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:38 np0005531887 nova_compute[186849]: 2025-11-22 08:40:38.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:40.246 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:40:40 np0005531887 nova_compute[186849]: 2025-11-22 08:40:40.247 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:40.247 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:40:40 np0005531887 podman[244982]: 2025-11-22 08:40:40.831215885 +0000 UTC m=+0.054923611 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:40:42 np0005531887 nova_compute[186849]: 2025-11-22 08:40:42.891 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:43 np0005531887 nova_compute[186849]: 2025-11-22 08:40:43.554 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:44 np0005531887 podman[245000]: 2025-11-22 08:40:44.839201097 +0000 UTC m=+0.058675912 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:40:45 np0005531887 nova_compute[186849]: 2025-11-22 08:40:45.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:48 np0005531887 nova_compute[186849]: 2025-11-22 08:40:48.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:40:48.249 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:40:48 np0005531887 nova_compute[186849]: 2025-11-22 08:40:48.555 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:48 np0005531887 podman[245020]: 2025-11-22 08:40:48.835495084 +0000 UTC m=+0.054417896 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:40:53 np0005531887 nova_compute[186849]: 2025-11-22 08:40:53.005 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:53 np0005531887 nova_compute[186849]: 2025-11-22 08:40:53.557 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:54 np0005531887 podman[245044]: 2025-11-22 08:40:54.834147949 +0000 UTC m=+0.056765005 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7)
Nov 22 03:40:55 np0005531887 nova_compute[186849]: 2025-11-22 08:40:55.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:40:58 np0005531887 nova_compute[186849]: 2025-11-22 08:40:58.009 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:40:58 np0005531887 nova_compute[186849]: 2025-11-22 08:40:58.559 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:00 np0005531887 podman[245066]: 2025-11-22 08:41:00.860974364 +0000 UTC m=+0.075196636 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:41:00 np0005531887 podman[245067]: 2025-11-22 08:41:00.867162446 +0000 UTC m=+0.078051237 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:41:03 np0005531887 nova_compute[186849]: 2025-11-22 08:41:03.012 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:03 np0005531887 nova_compute[186849]: 2025-11-22 08:41:03.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:04 np0005531887 podman[245111]: 2025-11-22 08:41:04.853381155 +0000 UTC m=+0.063029978 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:41:08 np0005531887 nova_compute[186849]: 2025-11-22 08:41:08.015 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:08 np0005531887 nova_compute[186849]: 2025-11-22 08:41:08.563 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:11 np0005531887 podman[245135]: 2025-11-22 08:41:11.862534555 +0000 UTC m=+0.081493073 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 03:41:13 np0005531887 nova_compute[186849]: 2025-11-22 08:41:13.019 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:13 np0005531887 nova_compute[186849]: 2025-11-22 08:41:13.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:15 np0005531887 podman[245156]: 2025-11-22 08:41:15.830160538 +0000 UTC m=+0.054935270 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:41:18 np0005531887 nova_compute[186849]: 2025-11-22 08:41:18.021 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:18 np0005531887 nova_compute[186849]: 2025-11-22 08:41:18.567 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:19 np0005531887 podman[245176]: 2025-11-22 08:41:19.841145876 +0000 UTC m=+0.053656878 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:41:23 np0005531887 nova_compute[186849]: 2025-11-22 08:41:23.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:23 np0005531887 nova_compute[186849]: 2025-11-22 08:41:23.573 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:23 np0005531887 nova_compute[186849]: 2025-11-22 08:41:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:24 np0005531887 nova_compute[186849]: 2025-11-22 08:41:24.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:24 np0005531887 nova_compute[186849]: 2025-11-22 08:41:24.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:41:24 np0005531887 nova_compute[186849]: 2025-11-22 08:41:24.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:41:24 np0005531887 nova_compute[186849]: 2025-11-22 08:41:24.797 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:41:25 np0005531887 podman[245200]: 2025-11-22 08:41:25.845618943 +0000 UTC m=+0.065537500 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 03:41:28 np0005531887 nova_compute[186849]: 2025-11-22 08:41:28.031 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:28 np0005531887 nova_compute[186849]: 2025-11-22 08:41:28.576 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:31 np0005531887 nova_compute[186849]: 2025-11-22 08:41:31.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:31 np0005531887 nova_compute[186849]: 2025-11-22 08:41:31.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:41:31 np0005531887 podman[245221]: 2025-11-22 08:41:31.843894117 +0000 UTC m=+0.056930589 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:41:31 np0005531887 podman[245222]: 2025-11-22 08:41:31.867589738 +0000 UTC m=+0.076653963 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:41:32 np0005531887 nova_compute[186849]: 2025-11-22 08:41:32.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:33 np0005531887 nova_compute[186849]: 2025-11-22 08:41:33.035 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:33 np0005531887 nova_compute[186849]: 2025-11-22 08:41:33.578 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:33 np0005531887 nova_compute[186849]: 2025-11-22 08:41:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:34 np0005531887 nova_compute[186849]: 2025-11-22 08:41:34.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:34 np0005531887 nova_compute[186849]: 2025-11-22 08:41:34.824 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:34 np0005531887 nova_compute[186849]: 2025-11-22 08:41:34.824 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:34 np0005531887 nova_compute[186849]: 2025-11-22 08:41:34.825 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:34 np0005531887 nova_compute[186849]: 2025-11-22 08:41:34.825 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.089 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.090 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.27406692504883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.090 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.090 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.194 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.194 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.317 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.473 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.474 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.492 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.537 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.566 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.589 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.591 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:41:35 np0005531887 nova_compute[186849]: 2025-11-22 08:41:35.591 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:35 np0005531887 podman[245267]: 2025-11-22 08:41:35.825132653 +0000 UTC m=+0.044800781 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:41:37.379 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:41:37.379 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:41:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:41:37.379 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:41:38 np0005531887 nova_compute[186849]: 2025-11-22 08:41:38.038 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:38 np0005531887 nova_compute[186849]: 2025-11-22 08:41:38.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:40 np0005531887 nova_compute[186849]: 2025-11-22 08:41:40.590 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:40 np0005531887 nova_compute[186849]: 2025-11-22 08:41:40.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:42 np0005531887 podman[245292]: 2025-11-22 08:41:42.836639112 +0000 UTC m=+0.057083563 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 03:41:43 np0005531887 nova_compute[186849]: 2025-11-22 08:41:43.040 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:43 np0005531887 nova_compute[186849]: 2025-11-22 08:41:43.583 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:46 np0005531887 podman[245312]: 2025-11-22 08:41:46.869369042 +0000 UTC m=+0.086926815 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:41:47 np0005531887 nova_compute[186849]: 2025-11-22 08:41:47.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:41:48 np0005531887 nova_compute[186849]: 2025-11-22 08:41:48.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:48 np0005531887 nova_compute[186849]: 2025-11-22 08:41:48.583 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:50 np0005531887 podman[245332]: 2025-11-22 08:41:50.828095822 +0000 UTC m=+0.047755723 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:41:53 np0005531887 nova_compute[186849]: 2025-11-22 08:41:53.047 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:53 np0005531887 nova_compute[186849]: 2025-11-22 08:41:53.586 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:56 np0005531887 podman[245357]: 2025-11-22 08:41:56.870355251 +0000 UTC m=+0.087549871 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:41:58 np0005531887 nova_compute[186849]: 2025-11-22 08:41:58.050 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:41:58 np0005531887 nova_compute[186849]: 2025-11-22 08:41:58.587 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:02 np0005531887 podman[245378]: 2025-11-22 08:42:02.847290222 +0000 UTC m=+0.066381682 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:42:02 np0005531887 podman[245379]: 2025-11-22 08:42:02.883495321 +0000 UTC m=+0.093732183 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 03:42:03 np0005531887 nova_compute[186849]: 2025-11-22 08:42:03.051 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:03 np0005531887 nova_compute[186849]: 2025-11-22 08:42:03.589 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:06 np0005531887 podman[245423]: 2025-11-22 08:42:06.824890529 +0000 UTC m=+0.047525848 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:42:08 np0005531887 nova_compute[186849]: 2025-11-22 08:42:08.055 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:08 np0005531887 nova_compute[186849]: 2025-11-22 08:42:08.591 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:13 np0005531887 nova_compute[186849]: 2025-11-22 08:42:13.057 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:13 np0005531887 nova_compute[186849]: 2025-11-22 08:42:13.593 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:13 np0005531887 podman[245449]: 2025-11-22 08:42:13.85444303 +0000 UTC m=+0.067068207 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 03:42:17 np0005531887 podman[245467]: 2025-11-22 08:42:17.855166816 +0000 UTC m=+0.073780542 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:42:18 np0005531887 nova_compute[186849]: 2025-11-22 08:42:18.061 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:18 np0005531887 nova_compute[186849]: 2025-11-22 08:42:18.595 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:21 np0005531887 podman[245489]: 2025-11-22 08:42:21.831914463 +0000 UTC m=+0.053119545 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:42:23 np0005531887 nova_compute[186849]: 2025-11-22 08:42:23.063 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:23 np0005531887 nova_compute[186849]: 2025-11-22 08:42:23.596 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:23 np0005531887 nova_compute[186849]: 2025-11-22 08:42:23.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:24 np0005531887 nova_compute[186849]: 2025-11-22 08:42:24.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:24 np0005531887 nova_compute[186849]: 2025-11-22 08:42:24.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:42:24 np0005531887 nova_compute[186849]: 2025-11-22 08:42:24.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:42:24 np0005531887 nova_compute[186849]: 2025-11-22 08:42:24.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:42:27 np0005531887 podman[245513]: 2025-11-22 08:42:27.919406698 +0000 UTC m=+0.083715737 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Nov 22 03:42:28 np0005531887 nova_compute[186849]: 2025-11-22 08:42:28.066 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:28 np0005531887 nova_compute[186849]: 2025-11-22 08:42:28.598 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:31 np0005531887 nova_compute[186849]: 2025-11-22 08:42:31.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:31 np0005531887 nova_compute[186849]: 2025-11-22 08:42:31.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:42:33 np0005531887 nova_compute[186849]: 2025-11-22 08:42:33.072 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:33 np0005531887 nova_compute[186849]: 2025-11-22 08:42:33.600 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:33 np0005531887 podman[245535]: 2025-11-22 08:42:33.846403852 +0000 UTC m=+0.063777347 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:42:33 np0005531887 podman[245536]: 2025-11-22 08:42:33.909398249 +0000 UTC m=+0.124074828 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 03:42:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:34.715 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:42:34 np0005531887 nova_compute[186849]: 2025-11-22 08:42:34.715 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:34.716 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:42:34 np0005531887 nova_compute[186849]: 2025-11-22 08:42:34.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:34 np0005531887 nova_compute[186849]: 2025-11-22 08:42:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.995 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.997 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5748MB free_disk=73.27408599853516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:35 np0005531887 nova_compute[186849]: 2025-11-22 08:42:35.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.112 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.112 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.232 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.260 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.262 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:42:36 np0005531887 nova_compute[186849]: 2025-11-22 08:42:36.262 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.675 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:42:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:37.379 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:37.379 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:42:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:37.380 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:42:37 np0005531887 podman[245581]: 2025-11-22 08:42:37.843021668 +0000 UTC m=+0.060807544 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:42:38 np0005531887 nova_compute[186849]: 2025-11-22 08:42:38.074 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:38 np0005531887 nova_compute[186849]: 2025-11-22 08:42:38.603 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:42:38.719 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:42:41 np0005531887 nova_compute[186849]: 2025-11-22 08:42:41.262 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:41 np0005531887 nova_compute[186849]: 2025-11-22 08:42:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:43 np0005531887 nova_compute[186849]: 2025-11-22 08:42:43.078 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:43 np0005531887 nova_compute[186849]: 2025-11-22 08:42:43.605 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:44 np0005531887 podman[245605]: 2025-11-22 08:42:44.852409144 +0000 UTC m=+0.063417828 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 03:42:47 np0005531887 nova_compute[186849]: 2025-11-22 08:42:47.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:48 np0005531887 nova_compute[186849]: 2025-11-22 08:42:48.861 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:48 np0005531887 nova_compute[186849]: 2025-11-22 08:42:48.862 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:42:48 np0005531887 podman[245624]: 2025-11-22 08:42:48.951022403 +0000 UTC m=+0.059087262 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:42:52 np0005531887 podman[245645]: 2025-11-22 08:42:52.837198565 +0000 UTC m=+0.052224802 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:42:53 np0005531887 nova_compute[186849]: 2025-11-22 08:42:53.608 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:53 np0005531887 nova_compute[186849]: 2025-11-22 08:42:53.864 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:58 np0005531887 nova_compute[186849]: 2025-11-22 08:42:58.611 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:42:58 np0005531887 nova_compute[186849]: 2025-11-22 08:42:58.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:42:58 np0005531887 podman[245669]: 2025-11-22 08:42:58.853660356 +0000 UTC m=+0.067682952 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 22 03:42:58 np0005531887 nova_compute[186849]: 2025-11-22 08:42:58.865 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:03 np0005531887 nova_compute[186849]: 2025-11-22 08:43:03.614 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:03 np0005531887 nova_compute[186849]: 2025-11-22 08:43:03.866 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:04 np0005531887 podman[245692]: 2025-11-22 08:43:04.84583491 +0000 UTC m=+0.059873831 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:43:04 np0005531887 podman[245693]: 2025-11-22 08:43:04.90854984 +0000 UTC m=+0.119379952 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 03:43:08 np0005531887 nova_compute[186849]: 2025-11-22 08:43:08.618 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:08 np0005531887 podman[245738]: 2025-11-22 08:43:08.861133314 +0000 UTC m=+0.083031271 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:43:08 np0005531887 nova_compute[186849]: 2025-11-22 08:43:08.868 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:13 np0005531887 nova_compute[186849]: 2025-11-22 08:43:13.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:13 np0005531887 nova_compute[186849]: 2025-11-22 08:43:13.870 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:15 np0005531887 podman[245763]: 2025-11-22 08:43:15.836315009 +0000 UTC m=+0.054174471 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.663 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.663 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.686 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.772 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.773 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.783 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.784 186853 INFO nova.compute.claims [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Claim successful on node compute-1.ctlplane.example.com
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.960 186853 DEBUG nova.compute.provider_tree [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:43:16 np0005531887 nova_compute[186849]: 2025-11-22 08:43:16.974 186853 DEBUG nova.scheduler.client.report [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.029 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.030 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.084 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.084 186853 DEBUG nova.network.neutron [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.100 186853 INFO nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.120 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.211 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.212 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.213 186853 INFO nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Creating image(s)
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.214 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.214 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.215 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.233 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.300 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.301 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.302 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.314 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.377 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.378 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.415 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.416 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.416 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.482 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.482 186853 DEBUG nova.virt.disk.api [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.483 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.548 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.549 186853 DEBUG nova.virt.disk.api [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.550 186853 DEBUG nova.objects.instance [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid dc53f437-5252-470a-b342-2c885312c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.564 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.565 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Ensure instance console log exists: /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.565 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.566 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:43:17 np0005531887 nova_compute[186849]: 2025-11-22 08:43:17.566 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:43:18 np0005531887 nova_compute[186849]: 2025-11-22 08:43:18.620 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:18 np0005531887 nova_compute[186849]: 2025-11-22 08:43:18.629 186853 DEBUG nova.policy [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 03:43:18 np0005531887 nova_compute[186849]: 2025-11-22 08:43:18.873 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:43:19 np0005531887 podman[245797]: 2025-11-22 08:43:19.846033245 +0000 UTC m=+0.063830797 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.750 186853 DEBUG nova.network.neutron [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Successfully updated port: 15aaa9ce-5a60-4a63-a8ba-48052e19c726 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.769 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.769 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.770 186853 DEBUG nova.network.neutron [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.889 186853 DEBUG nova.compute.manager [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.889 186853 DEBUG nova.compute.manager [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Refreshing instance network info cache due to event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.890 186853 DEBUG oslo_concurrency.lockutils [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:43:20 np0005531887 nova_compute[186849]: 2025-11-22 08:43:20.974 186853 DEBUG nova.network.neutron [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.527 186853 DEBUG nova.network.neutron [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.549 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.550 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Instance network_info: |[{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.551 186853 DEBUG oslo_concurrency.lockutils [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.552 186853 DEBUG nova.network.neutron [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Refreshing network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.556 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Start _get_guest_xml network_info=[{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.563 186853 WARNING nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.573 186853 DEBUG nova.virt.libvirt.host [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.574 186853 DEBUG nova.virt.libvirt.host [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.579 186853 DEBUG nova.virt.libvirt.host [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.580 186853 DEBUG nova.virt.libvirt.host [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.582 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.582 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.582 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.583 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.583 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.583 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.583 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.584 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.584 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.584 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.584 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.584 186853 DEBUG nova.virt.hardware [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.590 186853 DEBUG nova.virt.libvirt.vif [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:43:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269207377',display_name='tempest-TestNetworkBasicOps-server-269207377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269207377',id=173,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVFhbK0zEFuzzS9mOMhvzYt1U5yhMs7nmXJvrYO4DXSkDJzAK+gNzkPgnZlZd9LfZ+ik1SUgGypo2/GDMTGfa7lOzAZUoSfQvBpUgCfwN0WUnxXCSjzKwwbmMJsiPDuZQ==',key_name='tempest-TestNetworkBasicOps-1867718137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-9qve4ll4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:43:17Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=dc53f437-5252-470a-b342-2c885312c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.590 186853 DEBUG nova.network.os_vif_util [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.592 186853 DEBUG nova.network.os_vif_util [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.592 186853 DEBUG nova.objects.instance [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid dc53f437-5252-470a-b342-2c885312c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.607 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <uuid>dc53f437-5252-470a-b342-2c885312c906</uuid>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <name>instance-000000ad</name>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestNetworkBasicOps-server-269207377</nova:name>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:43:22</nova:creationTime>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        <nova:port uuid="15aaa9ce-5a60-4a63-a8ba-48052e19c726">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="serial">dc53f437-5252-470a-b342-2c885312c906</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="uuid">dc53f437-5252-470a-b342-2c885312c906</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.config"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:26:d0:6c"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <target dev="tap15aaa9ce-5a"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/console.log" append="off"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:43:22 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:43:22 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:43:22 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:43:22 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.609 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Preparing to wait for external event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.609 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.609 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.610 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.610 186853 DEBUG nova.virt.libvirt.vif [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:43:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269207377',display_name='tempest-TestNetworkBasicOps-server-269207377',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269207377',id=173,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVFhbK0zEFuzzS9mOMhvzYt1U5yhMs7nmXJvrYO4DXSkDJzAK+gNzkPgnZlZd9LfZ+ik1SUgGypo2/GDMTGfa7lOzAZUoSfQvBpUgCfwN0WUnxXCSjzKwwbmMJsiPDuZQ==',key_name='tempest-TestNetworkBasicOps-1867718137',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-9qve4ll4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:43:17Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=dc53f437-5252-470a-b342-2c885312c906,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.611 186853 DEBUG nova.network.os_vif_util [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.611 186853 DEBUG nova.network.os_vif_util [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.612 186853 DEBUG os_vif [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.613 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.613 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.613 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.616 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.617 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15aaa9ce-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.617 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15aaa9ce-5a, col_values=(('external_ids', {'iface-id': '15aaa9ce-5a60-4a63-a8ba-48052e19c726', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:d0:6c', 'vm-uuid': 'dc53f437-5252-470a-b342-2c885312c906'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:22 np0005531887 NetworkManager[55210]: <info>  [1763801002.6207] manager: (tap15aaa9ce-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.622 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.630 186853 INFO os_vif [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a')#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.689 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.690 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.690 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:26:d0:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.691 186853 INFO nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Using config drive#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.967 186853 INFO nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Creating config drive at /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.config#033[00m
Nov 22 03:43:22 np0005531887 nova_compute[186849]: 2025-11-22 08:43:22.973 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8qg5z3y7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.110 186853 DEBUG oslo_concurrency.processutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8qg5z3y7" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:43:23 np0005531887 kernel: tap15aaa9ce-5a: entered promiscuous mode
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.2041] manager: (tap15aaa9ce-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/268)
Nov 22 03:43:23 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:23Z|00566|binding|INFO|Claiming lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 for this chassis.
Nov 22 03:43:23 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:23Z|00567|binding|INFO|15aaa9ce-5a60-4a63-a8ba-48052e19c726: Claiming fa:16:3e:26:d0:6c 10.100.0.5
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.205 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.219 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:d0:6c 10.100.0.5'], port_security=['fa:16:3e:26:d0:6c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dc53f437-5252-470a-b342-2c885312c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf52ec35-2c17-43e3-9550-8e20cdf2c2b7, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=15aaa9ce-5a60-4a63-a8ba-48052e19c726) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.220 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 in datapath 6b97ad36-fe6a-4ecc-ae0a-fc772d456632 bound to our chassis#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.221 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b97ad36-fe6a-4ecc-ae0a-fc772d456632#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.237 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[38d3be58-44e6-4a2e-8983-1793f92313d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.238 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b97ad36-f1 in ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 22 03:43:23 np0005531887 systemd-udevd[245848]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.245 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b97ad36-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.246 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[45930680-ee53-4636-97e8-f9c28dfdf1b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.247 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[115f4f07-ee8a-40d8-bd6a-43b06dbdab87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.2581] device (tap15aaa9ce-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.2590] device (tap15aaa9ce-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.264 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 systemd-machined[153180]: New machine qemu-60-instance-000000ad.
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.266 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[18203131-a9d7-4cdc-b196-915e21b0d84f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:23Z|00568|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 ovn-installed in OVS
Nov 22 03:43:23 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:23Z|00569|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 up in Southbound
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.273 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 systemd[1]: Started Virtual Machine qemu-60-instance-000000ad.
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.290 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[edec93cb-55bf-420c-ad9a-e05adba5cdb5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 podman[245830]: 2025-11-22 08:43:23.299401315 +0000 UTC m=+0.100457887 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.327 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ed14d57c-cbe3-4cc7-93b6-9e7b691a8a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.335 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[b0533702-9481-4261-a503-a7fd438bb75c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 systemd-udevd[245852]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.3369] manager: (tap6b97ad36-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.371 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[ff16f914-1b59-4576-a0c2-1f5e4e5d3dc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.376 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[8b18bf34-667c-4720-8a85-b9480cb0704d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.4032] device (tap6b97ad36-f0): carrier: link connected
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.410 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[1827e3ac-5506-44c9-b15b-984639eb77f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.431 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[f5359a12-f2b0-453b-ac6b-a0ea0e6f9e56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b97ad36-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:84:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 774601, 'reachable_time': 39171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245893, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.452 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2f52a5-a010-4156-b875-fd2e81d4a820]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:84a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 774601, 'tstamp': 774601}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245894, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.477 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ce53ac14-9649-4d02-a8d2-5103789623d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b97ad36-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:84:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 774601, 'reachable_time': 39171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245895, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.515 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8611ce42-d9eb-47f8-9455-6f56af3bed15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.593 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[db6b5e1f-a028-4fed-8b5c-aca424b6d0b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.594 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b97ad36-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.595 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.595 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b97ad36-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.598 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 NetworkManager[55210]: <info>  [1763801003.5990] manager: (tap6b97ad36-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Nov 22 03:43:23 np0005531887 kernel: tap6b97ad36-f0: entered promiscuous mode
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.600 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.603 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b97ad36-f0, col_values=(('external_ids', {'iface-id': '04e092e8-b0e3-44aa-842f-7ec0ec9be431'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.605 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:23Z|00570|binding|INFO|Releasing lport 04e092e8-b0e3-44aa-842f-7ec0ec9be431 from this chassis (sb_readonly=0)
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.619 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.623 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.624 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.626 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c951fe35-0d23-420b-8449-45a88c4ec0ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.626 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-6b97ad36-fe6a-4ecc-ae0a-fc772d456632
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.pid.haproxy
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 6b97ad36-fe6a-4ecc-ae0a-fc772d456632
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:43:23 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:23.627 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'env', 'PROCESS_TAG=haproxy-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b97ad36-fe6a-4ecc-ae0a-fc772d456632.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.694 186853 DEBUG nova.compute.manager [req-d953486d-d207-4d4b-b45a-d6d0ec2bb9f0 req-e338e6cd-d53c-44b9-a736-7ed73871d900 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.695 186853 DEBUG oslo_concurrency.lockutils [req-d953486d-d207-4d4b-b45a-d6d0ec2bb9f0 req-e338e6cd-d53c-44b9-a736-7ed73871d900 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.696 186853 DEBUG oslo_concurrency.lockutils [req-d953486d-d207-4d4b-b45a-d6d0ec2bb9f0 req-e338e6cd-d53c-44b9-a736-7ed73871d900 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.696 186853 DEBUG oslo_concurrency.lockutils [req-d953486d-d207-4d4b-b45a-d6d0ec2bb9f0 req-e338e6cd-d53c-44b9-a736-7ed73871d900 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.696 186853 DEBUG nova.compute.manager [req-d953486d-d207-4d4b-b45a-d6d0ec2bb9f0 req-e338e6cd-d53c-44b9-a736-7ed73871d900 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Processing event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.752 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801003.7517707, dc53f437-5252-470a-b342-2c885312c906 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.753 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] VM Started (Lifecycle Event)#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.755 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.764 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.769 186853 INFO nova.virt.libvirt.driver [-] [instance: dc53f437-5252-470a-b342-2c885312c906] Instance spawned successfully.#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.770 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.773 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.777 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.801 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.801 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.802 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.802 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.803 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.803 186853 DEBUG nova.virt.libvirt.driver [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.859 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.859 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801003.7529805, dc53f437-5252-470a-b342-2c885312c906 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.860 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.885 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.889 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801003.7647223, dc53f437-5252-470a-b342-2c885312c906 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.890 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.923 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:23 np0005531887 nova_compute[186849]: 2025-11-22 08:43:23.931 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.001 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.013 186853 DEBUG nova.network.neutron [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Updated VIF entry in instance network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.014 186853 DEBUG nova.network.neutron [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.017 186853 INFO nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Took 6.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.018 186853 DEBUG nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.027 186853 DEBUG oslo_concurrency.lockutils [req-d7eb68ff-1d5e-457b-933d-b9cd1b65f2b2 req-91fb7946-e161-4d6e-ba41-6d6d983df2c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:43:24 np0005531887 podman[245934]: 2025-11-22 08:43:24.064118869 +0000 UTC m=+0.068431351 container create 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 03:43:24 np0005531887 systemd[1]: Started libpod-conmon-48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0.scope.
Nov 22 03:43:24 np0005531887 podman[245934]: 2025-11-22 08:43:24.031850486 +0000 UTC m=+0.036162998 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:43:24 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:43:24 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/831ec5805b6c684b7bb2953e1a5c80f72e866746977f9514ea9dc90a639a91bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:43:24 np0005531887 podman[245934]: 2025-11-22 08:43:24.173656518 +0000 UTC m=+0.177969010 container init 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 03:43:24 np0005531887 podman[245934]: 2025-11-22 08:43:24.182439603 +0000 UTC m=+0.186752095 container start 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.198 186853 INFO nova.compute.manager [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Took 7.47 seconds to build instance.#033[00m
Nov 22 03:43:24 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [NOTICE]   (245953) : New worker (245955) forked
Nov 22 03:43:24 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [NOTICE]   (245953) : Loading success.
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.215 186853 DEBUG oslo_concurrency.lockutils [None req-625359c6-06ac-4824-820b-64a6302fb5de 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:24 np0005531887 nova_compute[186849]: 2025-11-22 08:43:24.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.918 186853 DEBUG nova.compute.manager [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.918 186853 DEBUG oslo_concurrency.lockutils [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.919 186853 DEBUG oslo_concurrency.lockutils [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.919 186853 DEBUG oslo_concurrency.lockutils [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.919 186853 DEBUG nova.compute.manager [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] No waiting events found dispatching network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:25 np0005531887 nova_compute[186849]: 2025-11-22 08:43:25.919 186853 WARNING nova.compute.manager [req-babd415d-4d3e-4d0a-b68c-dac19de5c442 req-dfbed793-3829-4e50-b0c2-1388d7f3227f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received unexpected event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:43:26 np0005531887 nova_compute[186849]: 2025-11-22 08:43:26.506 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:43:26 np0005531887 nova_compute[186849]: 2025-11-22 08:43:26.507 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquired lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:43:26 np0005531887 nova_compute[186849]: 2025-11-22 08:43:26.507 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 22 03:43:26 np0005531887 nova_compute[186849]: 2025-11-22 08:43:26.507 186853 DEBUG nova.objects.instance [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dc53f437-5252-470a-b342-2c885312c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:43:27 np0005531887 nova_compute[186849]: 2025-11-22 08:43:27.621 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:27 np0005531887 nova_compute[186849]: 2025-11-22 08:43:27.990 186853 DEBUG nova.network.neutron [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.019 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Releasing lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.019 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 22 03:43:28 np0005531887 NetworkManager[55210]: <info>  [1763801008.1170] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.116 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:28 np0005531887 NetworkManager[55210]: <info>  [1763801008.1180] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Nov 22 03:43:28 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:28Z|00571|binding|INFO|Releasing lport 04e092e8-b0e3-44aa-842f-7ec0ec9be431 from this chassis (sb_readonly=0)
Nov 22 03:43:28 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:28Z|00572|binding|INFO|Releasing lport 04e092e8-b0e3-44aa-842f-7ec0ec9be431 from this chassis (sb_readonly=0)
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.152 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:28 np0005531887 nova_compute[186849]: 2025-11-22 08:43:28.628 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.305 186853 DEBUG nova.compute.manager [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.306 186853 DEBUG nova.compute.manager [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Refreshing instance network info cache due to event network-changed-15aaa9ce-5a60-4a63-a8ba-48052e19c726. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.307 186853 DEBUG oslo_concurrency.lockutils [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.307 186853 DEBUG oslo_concurrency.lockutils [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.307 186853 DEBUG nova.network.neutron [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Refreshing network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.752 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.754 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.754 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.755 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.755 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.768 186853 INFO nova.compute.manager [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Terminating instance#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.780 186853 DEBUG nova.compute.manager [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 22 03:43:29 np0005531887 kernel: tap15aaa9ce-5a (unregistering): left promiscuous mode
Nov 22 03:43:29 np0005531887 NetworkManager[55210]: <info>  [1763801009.8089] device (tap15aaa9ce-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:43:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:29Z|00573|binding|INFO|Releasing lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 from this chassis (sb_readonly=0)
Nov 22 03:43:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:29Z|00574|binding|INFO|Setting lport 15aaa9ce-5a60-4a63-a8ba-48052e19c726 down in Southbound
Nov 22 03:43:29 np0005531887 ovn_controller[95130]: 2025-11-22T08:43:29Z|00575|binding|INFO|Removing iface tap15aaa9ce-5a ovn-installed in OVS
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.836 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:29.846 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:d0:6c 10.100.0.5'], port_security=['fa:16:3e:26:d0:6c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dc53f437-5252-470a-b342-2c885312c906', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1012956209', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf52ec35-2c17-43e3-9550-8e20cdf2c2b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=15aaa9ce-5a60-4a63-a8ba-48052e19c726) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:29.849 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 15aaa9ce-5a60-4a63-a8ba-48052e19c726 in datapath 6b97ad36-fe6a-4ecc-ae0a-fc772d456632 unbound from our chassis#033[00m
Nov 22 03:43:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:29.850 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b97ad36-fe6a-4ecc-ae0a-fc772d456632, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:43:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:29.851 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbe8778-b546-42a2-825b-56235f370d2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:29 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:29.852 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 namespace which is not needed anymore#033[00m
Nov 22 03:43:29 np0005531887 nova_compute[186849]: 2025-11-22 08:43:29.853 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:29 np0005531887 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Nov 22 03:43:29 np0005531887 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d000000ad.scope: Consumed 6.562s CPU time.
Nov 22 03:43:29 np0005531887 systemd-machined[153180]: Machine qemu-60-instance-000000ad terminated.
Nov 22 03:43:29 np0005531887 podman[245965]: 2025-11-22 08:43:29.874237684 +0000 UTC m=+0.090334269 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:43:30 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [NOTICE]   (245953) : haproxy version is 2.8.14-c23fe91
Nov 22 03:43:30 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [NOTICE]   (245953) : path to executable is /usr/sbin/haproxy
Nov 22 03:43:30 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [WARNING]  (245953) : Exiting Master process...
Nov 22 03:43:30 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [ALERT]    (245953) : Current worker (245955) exited with code 143 (Terminated)
Nov 22 03:43:30 np0005531887 neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632[245949]: [WARNING]  (245953) : All workers exited. Exiting... (0)
Nov 22 03:43:30 np0005531887 systemd[1]: libpod-48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0.scope: Deactivated successfully.
Nov 22 03:43:30 np0005531887 podman[246011]: 2025-11-22 08:43:30.024725568 +0000 UTC m=+0.066625597 container died 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:43:30 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0-userdata-shm.mount: Deactivated successfully.
Nov 22 03:43:30 np0005531887 systemd[1]: var-lib-containers-storage-overlay-831ec5805b6c684b7bb2953e1a5c80f72e866746977f9514ea9dc90a639a91bf-merged.mount: Deactivated successfully.
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.063 186853 INFO nova.virt.libvirt.driver [-] [instance: dc53f437-5252-470a-b342-2c885312c906] Instance destroyed successfully.#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.064 186853 DEBUG nova.objects.instance [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid dc53f437-5252-470a-b342-2c885312c906 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:43:30 np0005531887 podman[246011]: 2025-11-22 08:43:30.072306446 +0000 UTC m=+0.114206475 container cleanup 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 03:43:30 np0005531887 systemd[1]: libpod-conmon-48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0.scope: Deactivated successfully.
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.084 186853 DEBUG nova.virt.libvirt.vif [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:43:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-269207377',display_name='tempest-TestNetworkBasicOps-server-269207377',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-269207377',id=173,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCVFhbK0zEFuzzS9mOMhvzYt1U5yhMs7nmXJvrYO4DXSkDJzAK+gNzkPgnZlZd9LfZ+ik1SUgGypo2/GDMTGfa7lOzAZUoSfQvBpUgCfwN0WUnxXCSjzKwwbmMJsiPDuZQ==',key_name='tempest-TestNetworkBasicOps-1867718137',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:43:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-9qve4ll4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:43:24Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=dc53f437-5252-470a-b342-2c885312c906,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.084 186853 DEBUG nova.network.os_vif_util [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.086 186853 DEBUG nova.network.os_vif_util [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.086 186853 DEBUG os_vif [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.088 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.088 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15aaa9ce-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.095 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.097 186853 INFO os_vif [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:d0:6c,bridge_name='br-int',has_traffic_filtering=True,id=15aaa9ce-5a60-4a63-a8ba-48052e19c726,network=Network(6b97ad36-fe6a-4ecc-ae0a-fc772d456632),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap15aaa9ce-5a')#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.098 186853 INFO nova.virt.libvirt.driver [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Deleting instance files /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906_del#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.099 186853 INFO nova.virt.libvirt.driver [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Deletion of /var/lib/nova/instances/dc53f437-5252-470a-b342-2c885312c906_del complete#033[00m
Nov 22 03:43:30 np0005531887 podman[246057]: 2025-11-22 08:43:30.139419404 +0000 UTC m=+0.041128991 container remove 48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.146 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[82ddc1e9-a217-420e-b1e3-d0d231f9c008]: (4, ('Sat Nov 22 08:43:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 (48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0)\n48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0\nSat Nov 22 08:43:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 (48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0)\n48b4544fc7eeff34546810ed31a7e12db3315156b0405f2d7988b6fae6fe10f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.148 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[e0afedd0-1656-43a4-8c94-e6163e95d4fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.149 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b97ad36-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:30 np0005531887 kernel: tap6b97ad36-f0: left promiscuous mode
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.152 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.164 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.167 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[66c76d5b-0302-4547-b047-e1cdfb36eab5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.184 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4016456d-fc2f-41f7-b674-403463ece006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.186 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1855f708-888f-445a-9dfb-6a12f24972e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.205 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[d37c3dd9-03db-44b6-9fec-6ad23c3d64ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 774592, 'reachable_time': 42115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246072, 'error': None, 'target': 'ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.208 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b97ad36-fe6a-4ecc-ae0a-fc772d456632 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:43:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:30.209 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b6f595-3eca-43de-8ee7-cf8e51c9acb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:43:30 np0005531887 systemd[1]: run-netns-ovnmeta\x2d6b97ad36\x2dfe6a\x2d4ecc\x2dae0a\x2dfc772d456632.mount: Deactivated successfully.
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.240 186853 INFO nova.compute.manager [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.241 186853 DEBUG oslo.service.loopingcall [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.241 186853 DEBUG nova.compute.manager [-] [instance: dc53f437-5252-470a-b342-2c885312c906] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.242 186853 DEBUG nova.network.neutron [-] [instance: dc53f437-5252-470a-b342-2c885312c906] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.778 186853 DEBUG nova.network.neutron [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Updated VIF entry in instance network info cache for port 15aaa9ce-5a60-4a63-a8ba-48052e19c726. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.778 186853 DEBUG nova.network.neutron [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Updating instance_info_cache with network_info: [{"id": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "address": "fa:16:3e:26:d0:6c", "network": {"id": "6b97ad36-fe6a-4ecc-ae0a-fc772d456632", "bridge": "br-int", "label": "tempest-network-smoke--56310956", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15aaa9ce-5a", "ovs_interfaceid": "15aaa9ce-5a60-4a63-a8ba-48052e19c726", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:30 np0005531887 nova_compute[186849]: 2025-11-22 08:43:30.801 186853 DEBUG oslo_concurrency.lockutils [req-b3136705-c1f1-4fe3-9dc4-23dd6f078946 req-b8930398-448a-41d6-9d13-8275428b543f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-dc53f437-5252-470a-b342-2c885312c906" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.749 186853 DEBUG nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.750 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.750 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.751 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.751 186853 DEBUG nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] No waiting events found dispatching network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.751 186853 DEBUG nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-vif-unplugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.752 186853 DEBUG nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.752 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "dc53f437-5252-470a-b342-2c885312c906-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.752 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.752 186853 DEBUG oslo_concurrency.lockutils [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.753 186853 DEBUG nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] No waiting events found dispatching network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:43:31 np0005531887 nova_compute[186849]: 2025-11-22 08:43:31.753 186853 WARNING nova.compute.manager [req-1b137609-8ea9-4eb7-ae87-8ba5a111e6e3 req-7fc82aa3-036d-413a-b061-991ba0cdeb62 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: dc53f437-5252-470a-b342-2c885312c906] Received unexpected event network-vif-plugged-15aaa9ce-5a60-4a63-a8ba-48052e19c726 for instance with vm_state active and task_state deleting.#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.683 186853 DEBUG nova.network.neutron [-] [instance: dc53f437-5252-470a-b342-2c885312c906] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.701 186853 INFO nova.compute.manager [-] [instance: dc53f437-5252-470a-b342-2c885312c906] Took 2.46 seconds to deallocate network for instance.#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.771 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.772 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.828 186853 DEBUG nova.compute.provider_tree [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.843 186853 DEBUG nova.scheduler.client.report [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.867 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.893 186853 INFO nova.scheduler.client.report [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance dc53f437-5252-470a-b342-2c885312c906#033[00m
Nov 22 03:43:32 np0005531887 nova_compute[186849]: 2025-11-22 08:43:32.993 186853 DEBUG oslo_concurrency.lockutils [None req-2f96915e-0dab-404e-a00d-a0425e125c3e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "dc53f437-5252-470a-b342-2c885312c906" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:33 np0005531887 nova_compute[186849]: 2025-11-22 08:43:33.629 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:35 np0005531887 nova_compute[186849]: 2025-11-22 08:43:35.006 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:35.006 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:43:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:35.007 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:43:35 np0005531887 nova_compute[186849]: 2025-11-22 08:43:35.090 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:35 np0005531887 podman[246075]: 2025-11-22 08:43:35.84938772 +0000 UTC m=+0.069889777 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 22 03:43:35 np0005531887 podman[246076]: 2025-11-22 08:43:35.881499898 +0000 UTC m=+0.096180282 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.797 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.978 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.979 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5714MB free_disk=73.2740478515625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.980 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:36 np0005531887 nova_compute[186849]: 2025-11-22 08:43:36.980 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.190 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.190 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.214 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.230 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.264 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:43:37 np0005531887 nova_compute[186849]: 2025-11-22 08:43:37.265 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:37.380 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:37.380 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:43:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:37.381 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:43:38 np0005531887 nova_compute[186849]: 2025-11-22 08:43:38.631 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:39 np0005531887 podman[246123]: 2025-11-22 08:43:39.840607182 +0000 UTC m=+0.062426334 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:43:40 np0005531887 nova_compute[186849]: 2025-11-22 08:43:40.093 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:43:41.010 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:43:41 np0005531887 nova_compute[186849]: 2025-11-22 08:43:41.266 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:42 np0005531887 nova_compute[186849]: 2025-11-22 08:43:42.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:43 np0005531887 nova_compute[186849]: 2025-11-22 08:43:43.634 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:45 np0005531887 nova_compute[186849]: 2025-11-22 08:43:45.055 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801010.0545568, dc53f437-5252-470a-b342-2c885312c906 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:43:45 np0005531887 nova_compute[186849]: 2025-11-22 08:43:45.056 186853 INFO nova.compute.manager [-] [instance: dc53f437-5252-470a-b342-2c885312c906] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:43:45 np0005531887 nova_compute[186849]: 2025-11-22 08:43:45.085 186853 DEBUG nova.compute.manager [None req-30ae8fee-68e5-41d8-8671-cedcb6c9d5ba - - - - - -] [instance: dc53f437-5252-470a-b342-2c885312c906] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:43:45 np0005531887 nova_compute[186849]: 2025-11-22 08:43:45.094 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:46 np0005531887 podman[246148]: 2025-11-22 08:43:46.835738529 +0000 UTC m=+0.058617560 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 03:43:48 np0005531887 nova_compute[186849]: 2025-11-22 08:43:48.637 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:48 np0005531887 nova_compute[186849]: 2025-11-22 08:43:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:43:50 np0005531887 nova_compute[186849]: 2025-11-22 08:43:50.097 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:50 np0005531887 podman[246167]: 2025-11-22 08:43:50.8350877 +0000 UTC m=+0.054843268 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:43:53 np0005531887 nova_compute[186849]: 2025-11-22 08:43:53.638 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:53 np0005531887 podman[246187]: 2025-11-22 08:43:53.827445221 +0000 UTC m=+0.051449744 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:43:55 np0005531887 nova_compute[186849]: 2025-11-22 08:43:55.101 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:43:58 np0005531887 nova_compute[186849]: 2025-11-22 08:43:58.640 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531887 nova_compute[186849]: 2025-11-22 08:44:00.084 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531887 nova_compute[186849]: 2025-11-22 08:44:00.102 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531887 nova_compute[186849]: 2025-11-22 08:44:00.153 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:00 np0005531887 podman[246214]: 2025-11-22 08:44:00.835515824 +0000 UTC m=+0.054874330 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, release=1755695350)
Nov 22 03:44:03 np0005531887 nova_compute[186849]: 2025-11-22 08:44:03.641 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:05 np0005531887 nova_compute[186849]: 2025-11-22 08:44:05.105 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:06 np0005531887 podman[246236]: 2025-11-22 08:44:06.8550424 +0000 UTC m=+0.068584264 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 03:44:06 np0005531887 podman[246237]: 2025-11-22 08:44:06.876517747 +0000 UTC m=+0.085519920 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:44:08 np0005531887 nova_compute[186849]: 2025-11-22 08:44:08.644 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:10 np0005531887 nova_compute[186849]: 2025-11-22 08:44:10.107 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:10 np0005531887 podman[246283]: 2025-11-22 08:44:10.838474321 +0000 UTC m=+0.057640036 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:44:13 np0005531887 nova_compute[186849]: 2025-11-22 08:44:13.645 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:15 np0005531887 nova_compute[186849]: 2025-11-22 08:44:15.110 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:17 np0005531887 podman[246309]: 2025-11-22 08:44:17.859487332 +0000 UTC m=+0.070931522 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:44:18 np0005531887 nova_compute[186849]: 2025-11-22 08:44:18.646 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:20 np0005531887 nova_compute[186849]: 2025-11-22 08:44:20.112 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:21 np0005531887 podman[246328]: 2025-11-22 08:44:21.847464584 +0000 UTC m=+0.066921704 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:44:23 np0005531887 nova_compute[186849]: 2025-11-22 08:44:23.648 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:24 np0005531887 podman[246350]: 2025-11-22 08:44:24.837074788 +0000 UTC m=+0.055672918 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:44:25 np0005531887 nova_compute[186849]: 2025-11-22 08:44:25.114 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:25 np0005531887 nova_compute[186849]: 2025-11-22 08:44:25.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:27 np0005531887 nova_compute[186849]: 2025-11-22 08:44:27.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:27 np0005531887 nova_compute[186849]: 2025-11-22 08:44:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:44:27 np0005531887 nova_compute[186849]: 2025-11-22 08:44:27.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:44:27 np0005531887 nova_compute[186849]: 2025-11-22 08:44:27.780 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:44:28 np0005531887 nova_compute[186849]: 2025-11-22 08:44:28.651 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:30 np0005531887 nova_compute[186849]: 2025-11-22 08:44:30.116 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:31 np0005531887 podman[246374]: 2025-11-22 08:44:31.863379959 +0000 UTC m=+0.081911792 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:44:32 np0005531887 nova_compute[186849]: 2025-11-22 08:44:32.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:32 np0005531887 nova_compute[186849]: 2025-11-22 08:44:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:44:33 np0005531887 nova_compute[186849]: 2025-11-22 08:44:33.654 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:35 np0005531887 nova_compute[186849]: 2025-11-22 08:44:35.118 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:35 np0005531887 ovn_controller[95130]: 2025-11-22T08:44:35Z|00576|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:44:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.795 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.795 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.984 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.986 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5744MB free_disk=73.27412796020508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:36 np0005531887 nova_compute[186849]: 2025-11-22 08:44:36.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.101 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.102 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.126 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.160 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.162 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:44:37 np0005531887 nova_compute[186849]: 2025-11-22 08:44:37.163 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:44:37.381 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:44:37.381 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:44:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:44:37.381 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:44:37 np0005531887 podman[246396]: 2025-11-22 08:44:37.834171129 +0000 UTC m=+0.054173460 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:44:37 np0005531887 podman[246397]: 2025-11-22 08:44:37.88268145 +0000 UTC m=+0.094969462 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 03:44:38 np0005531887 nova_compute[186849]: 2025-11-22 08:44:38.655 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:39 np0005531887 nova_compute[186849]: 2025-11-22 08:44:39.155 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:39 np0005531887 nova_compute[186849]: 2025-11-22 08:44:39.156 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:40 np0005531887 nova_compute[186849]: 2025-11-22 08:44:40.121 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:41 np0005531887 nova_compute[186849]: 2025-11-22 08:44:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:41 np0005531887 podman[246442]: 2025-11-22 08:44:41.833363267 +0000 UTC m=+0.057347069 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:44:43 np0005531887 nova_compute[186849]: 2025-11-22 08:44:43.658 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:43 np0005531887 nova_compute[186849]: 2025-11-22 08:44:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:45 np0005531887 nova_compute[186849]: 2025-11-22 08:44:45.123 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:48 np0005531887 nova_compute[186849]: 2025-11-22 08:44:48.659 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:48 np0005531887 podman[246466]: 2025-11-22 08:44:48.83237777 +0000 UTC m=+0.053396063 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 03:44:50 np0005531887 nova_compute[186849]: 2025-11-22 08:44:50.125 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:50 np0005531887 nova_compute[186849]: 2025-11-22 08:44:50.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:44:52 np0005531887 podman[246487]: 2025-11-22 08:44:52.841106091 +0000 UTC m=+0.056512999 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:44:53 np0005531887 nova_compute[186849]: 2025-11-22 08:44:53.663 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:55 np0005531887 nova_compute[186849]: 2025-11-22 08:44:55.128 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:55 np0005531887 podman[246510]: 2025-11-22 08:44:55.238503936 +0000 UTC m=+0.078864797 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:44:55 np0005531887 nova_compute[186849]: 2025-11-22 08:44:55.819 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:44:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:44:55.820 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:44:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:44:55.821 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:44:58 np0005531887 nova_compute[186849]: 2025-11-22 08:44:58.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:00 np0005531887 nova_compute[186849]: 2025-11-22 08:45:00.131 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:00 np0005531887 nova_compute[186849]: 2025-11-22 08:45:00.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:01 np0005531887 nova_compute[186849]: 2025-11-22 08:45:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:01 np0005531887 nova_compute[186849]: 2025-11-22 08:45:01.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:45:01 np0005531887 nova_compute[186849]: 2025-11-22 08:45:01.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:45:01 np0005531887 nova_compute[186849]: 2025-11-22 08:45:01.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:01 np0005531887 nova_compute[186849]: 2025-11-22 08:45:01.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:45:02 np0005531887 podman[246532]: 2025-11-22 08:45:02.853165871 +0000 UTC m=+0.066299658 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 22 03:45:03 np0005531887 nova_compute[186849]: 2025-11-22 08:45:03.666 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:03 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:03.823 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:45:05 np0005531887 nova_compute[186849]: 2025-11-22 08:45:05.133 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:08 np0005531887 nova_compute[186849]: 2025-11-22 08:45:08.670 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:08 np0005531887 podman[246554]: 2025-11-22 08:45:08.844690301 +0000 UTC m=+0.066096615 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:45:08 np0005531887 podman[246555]: 2025-11-22 08:45:08.879042923 +0000 UTC m=+0.095029024 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:45:10 np0005531887 nova_compute[186849]: 2025-11-22 08:45:10.135 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:12 np0005531887 nova_compute[186849]: 2025-11-22 08:45:12.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:12 np0005531887 podman[246598]: 2025-11-22 08:45:12.832130979 +0000 UTC m=+0.053204297 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:45:13 np0005531887 nova_compute[186849]: 2025-11-22 08:45:13.672 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:15 np0005531887 nova_compute[186849]: 2025-11-22 08:45:15.139 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:18 np0005531887 nova_compute[186849]: 2025-11-22 08:45:18.673 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:19 np0005531887 podman[246622]: 2025-11-22 08:45:19.84497376 +0000 UTC m=+0.062166388 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:45:20 np0005531887 nova_compute[186849]: 2025-11-22 08:45:20.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:23 np0005531887 nova_compute[186849]: 2025-11-22 08:45:23.675 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:23 np0005531887 podman[246641]: 2025-11-22 08:45:23.868446366 +0000 UTC m=+0.087267925 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:45:25 np0005531887 nova_compute[186849]: 2025-11-22 08:45:25.143 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:25 np0005531887 podman[246661]: 2025-11-22 08:45:25.868274871 +0000 UTC m=+0.087853059 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:45:26 np0005531887 nova_compute[186849]: 2025-11-22 08:45:26.777 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:28 np0005531887 nova_compute[186849]: 2025-11-22 08:45:28.679 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:29 np0005531887 nova_compute[186849]: 2025-11-22 08:45:29.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:29 np0005531887 nova_compute[186849]: 2025-11-22 08:45:29.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:45:29 np0005531887 nova_compute[186849]: 2025-11-22 08:45:29.771 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:45:29 np0005531887 nova_compute[186849]: 2025-11-22 08:45:29.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:45:30 np0005531887 nova_compute[186849]: 2025-11-22 08:45:30.146 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:32 np0005531887 nova_compute[186849]: 2025-11-22 08:45:32.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:32 np0005531887 nova_compute[186849]: 2025-11-22 08:45:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:45:33 np0005531887 nova_compute[186849]: 2025-11-22 08:45:33.680 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:33 np0005531887 podman[246685]: 2025-11-22 08:45:33.841947791 +0000 UTC m=+0.064609267 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 22 03:45:33 np0005531887 nova_compute[186849]: 2025-11-22 08:45:33.948 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:33 np0005531887 nova_compute[186849]: 2025-11-22 08:45:33.948 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:33 np0005531887 nova_compute[186849]: 2025-11-22 08:45:33.963 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.042 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.042 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.050 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.050 186853 INFO nova.compute.claims [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.148 186853 DEBUG nova.compute.provider_tree [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.159 186853 DEBUG nova.scheduler.client.report [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.177 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.178 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.226 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.226 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.241 186853 INFO nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.261 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.361 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.364 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.365 186853 INFO nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Creating image(s)#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.366 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.366 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.367 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.382 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.452 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.453 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.454 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.470 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.538 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.540 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.610 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk 1073741824" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.611 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.612 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.637 186853 DEBUG nova.policy [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.682 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.683 186853 DEBUG nova.virt.disk.api [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.684 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.749 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.751 186853 DEBUG nova.virt.disk.api [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.751 186853 DEBUG nova.objects.instance [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f178c92-047c-4473-8ac2-6fc099de6eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.771 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.771 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Ensure instance console log exists: /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.772 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.772 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:34 np0005531887 nova_compute[186849]: 2025-11-22 08:45:34.773 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:35 np0005531887 nova_compute[186849]: 2025-11-22 08:45:35.149 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:35 np0005531887 nova_compute[186849]: 2025-11-22 08:45:35.322 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Successfully created port: 88d05ad1-3553-48ae-a1b7-c602b2689e38 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.205 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Successfully updated port: 88d05ad1-3553-48ae-a1b7-c602b2689e38 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.226 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.226 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.226 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.301 186853 DEBUG nova.compute.manager [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.302 186853 DEBUG nova.compute.manager [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing instance network info cache due to event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.302 186853 DEBUG oslo_concurrency.lockutils [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:45:36 np0005531887 nova_compute[186849]: 2025-11-22 08:45:36.382 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.138 186853 DEBUG nova.network.neutron [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updating instance_info_cache with network_info: [{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.156 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.156 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Instance network_info: |[{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.157 186853 DEBUG oslo_concurrency.lockutils [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.157 186853 DEBUG nova.network.neutron [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.160 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Start _get_guest_xml network_info=[{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'size': 0, 'boot_index': 0, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.168 186853 WARNING nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.176 186853 DEBUG nova.virt.libvirt.host [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.176 186853 DEBUG nova.virt.libvirt.host [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.181 186853 DEBUG nova.virt.libvirt.host [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.182 186853 DEBUG nova.virt.libvirt.host [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.183 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.184 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.184 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.184 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.184 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.185 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.185 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.185 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.185 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.185 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.186 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.186 186853 DEBUG nova.virt.hardware [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.189 186853 DEBUG nova.virt.libvirt.vif [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099507978',display_name='tempest-TestNetworkBasicOps-server-2099507978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099507978',id=177,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJxBF+uV7kLGfNLSXaArojldE//69MaOCr1c5h9e6Oog3H1LUjI4I5mHFbCXPKNHbwYRpo/jUhVybrlSvevbkWLY/VbM4wAXCfm1OZDPNMTrj2iVHp/CG10iy05ELyqlVQ==',key_name='tempest-TestNetworkBasicOps-250415143',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ca2gj73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:45:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=7f178c92-047c-4473-8ac2-6fc099de6eac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.190 186853 DEBUG nova.network.os_vif_util [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.190 186853 DEBUG nova.network.os_vif_util [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.191 186853 DEBUG nova.objects.instance [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f178c92-047c-4473-8ac2-6fc099de6eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.205 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] End _get_guest_xml xml=<domain type="kvm">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <uuid>7f178c92-047c-4473-8ac2-6fc099de6eac</uuid>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <name>instance-000000b1</name>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <memory>131072</memory>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <vcpu>1</vcpu>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <metadata>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:name>tempest-TestNetworkBasicOps-server-2099507978</nova:name>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:creationTime>2025-11-22 08:45:37</nova:creationTime>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:flavor name="m1.nano">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:memory>128</nova:memory>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:disk>1</nova:disk>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:swap>0</nova:swap>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:ephemeral>0</nova:ephemeral>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:vcpus>1</nova:vcpus>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      </nova:flavor>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:owner>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      </nova:owner>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <nova:ports>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        <nova:port uuid="88d05ad1-3553-48ae-a1b7-c602b2689e38">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:        </nova:port>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      </nova:ports>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </nova:instance>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </metadata>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <sysinfo type="smbios">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <system>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="manufacturer">RDO</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="product">OpenStack Compute</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="serial">7f178c92-047c-4473-8ac2-6fc099de6eac</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="uuid">7f178c92-047c-4473-8ac2-6fc099de6eac</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <entry name="family">Virtual Machine</entry>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </system>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </sysinfo>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <os>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <boot dev="hd"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <smbios mode="sysinfo"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </os>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <features>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <acpi/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <apic/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <vmcoreinfo/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </features>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <clock offset="utc">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <timer name="pit" tickpolicy="delay"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <timer name="hpet" present="no"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </clock>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <cpu mode="custom" match="exact">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <model>Nehalem</model>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <topology sockets="1" cores="1" threads="1"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </cpu>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  <devices>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <disk type="file" device="disk">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <target dev="vda" bus="virtio"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <disk type="file" device="cdrom">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <driver name="qemu" type="raw" cache="none"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <source file="/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.config"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <target dev="sda" bus="sata"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </disk>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <interface type="ethernet">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <mac address="fa:16:3e:e8:7b:46"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <driver name="vhost" rx_queue_size="512"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <mtu size="1442"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <target dev="tap88d05ad1-35"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </interface>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <serial type="pty">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <log file="/var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/console.log" append="off"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </serial>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <video>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <model type="virtio"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </video>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <input type="tablet" bus="usb"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <rng model="virtio">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <backend model="random">/dev/urandom</backend>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </rng>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="pci" model="pcie-root-port"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <controller type="usb" index="0"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    <memballoon model="virtio">
Nov 22 03:45:37 np0005531887 nova_compute[186849]:      <stats period="10"/>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:    </memballoon>
Nov 22 03:45:37 np0005531887 nova_compute[186849]:  </devices>
Nov 22 03:45:37 np0005531887 nova_compute[186849]: </domain>
Nov 22 03:45:37 np0005531887 nova_compute[186849]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.207 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Preparing to wait for external event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.208 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.208 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.208 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.209 186853 DEBUG nova.virt.libvirt.vif [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099507978',display_name='tempest-TestNetworkBasicOps-server-2099507978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099507978',id=177,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJxBF+uV7kLGfNLSXaArojldE//69MaOCr1c5h9e6Oog3H1LUjI4I5mHFbCXPKNHbwYRpo/jUhVybrlSvevbkWLY/VbM4wAXCfm1OZDPNMTrj2iVHp/CG10iy05ELyqlVQ==',key_name='tempest-TestNetworkBasicOps-250415143',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ca2gj73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:45:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=7f178c92-047c-4473-8ac2-6fc099de6eac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.209 186853 DEBUG nova.network.os_vif_util [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.210 186853 DEBUG nova.network.os_vif_util [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.210 186853 DEBUG os_vif [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.211 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.212 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.215 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.215 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88d05ad1-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.216 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88d05ad1-35, col_values=(('external_ids', {'iface-id': '88d05ad1-3553-48ae-a1b7-c602b2689e38', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:7b:46', 'vm-uuid': '7f178c92-047c-4473-8ac2-6fc099de6eac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:37 np0005531887 NetworkManager[55210]: <info>  [1763801137.2200] manager: (tap88d05ad1-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.227 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.228 186853 INFO os_vif [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35')#033[00m
Nov 22 03:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:37.382 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:37.382 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:37.383 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.505 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.506 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.506 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:e8:7b:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.506 186853 INFO nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Using config drive#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.786 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.787 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.787 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.811 186853 INFO nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Creating config drive at /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.config#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.816 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw25l1fj5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.857 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.922 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.923 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.945 186853 DEBUG oslo_concurrency.processutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw25l1fj5" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:37 np0005531887 nova_compute[186849]: 2025-11-22 08:45:37.987 186853 DEBUG oslo_concurrency.processutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 03:45:38 np0005531887 kernel: tap88d05ad1-35: entered promiscuous mode
Nov 22 03:45:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:38Z|00577|binding|INFO|Claiming lport 88d05ad1-3553-48ae-a1b7-c602b2689e38 for this chassis.
Nov 22 03:45:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:38Z|00578|binding|INFO|88d05ad1-3553-48ae-a1b7-c602b2689e38: Claiming fa:16:3e:e8:7b:46 10.100.0.11
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.0176] manager: (tap88d05ad1-35): new Tun device (/org/freedesktop/NetworkManager/Devices/274)
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.015 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.021 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.0290] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.028 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.0299] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.037 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:7b:46 10.100.0.11'], port_security=['fa:16:3e:e8:7b:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7f178c92-047c-4473-8ac2-6fc099de6eac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b34698d5-a6b0-4396-ae15-77dfaa064741', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=88d05ad1-3553-48ae-a1b7-c602b2689e38) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.038 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 88d05ad1-3553-48ae-a1b7-c602b2689e38 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c bound to our chassis
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.039 104084 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 03:45:38 np0005531887 systemd-udevd[246746]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.052 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[aefa081f-7119-41a7-aa69-3faefee54a19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.053 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75a459da-41 in ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.055 213790 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75a459da-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.055 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7ac9f9-4090-4400-ae7b-c660aa7039b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.057 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[ce737d39-8c98-4cb1-a638-ead75d513ead]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 systemd-machined[153180]: New machine qemu-61-instance-000000b1.
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.0690] device (tap88d05ad1-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.0723] device (tap88d05ad1-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.070 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5dc500-84d0-4051-931f-0a727774a574]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.097 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.098 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 systemd[1]: Started Virtual Machine qemu-61-instance-000000b1.
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.100 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.105 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5e33bc-2e44-4482-8408-9bfea888464b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.108 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:38Z|00579|binding|INFO|Setting lport 88d05ad1-3553-48ae-a1b7-c602b2689e38 ovn-installed in OVS
Nov 22 03:45:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:38Z|00580|binding|INFO|Setting lport 88d05ad1-3553-48ae-a1b7-c602b2689e38 up in Southbound
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.118 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.143 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[296ab48b-0204-437f-8538-c3837e5fe0fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.150 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[dafdfcfa-9384-48aa-a13d-5d0ae1622cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.1524] manager: (tap75a459da-40): new Veth device (/org/freedesktop/NetworkManager/Devices/277)
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.200 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee7e87d-1806-42f2-87d0-676f770b1782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.204 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[938d2737-49b4-49c2-9d94-f9a0879d0e4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.2375] device (tap75a459da-40): carrier: link connected
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.245 213804 DEBUG oslo.privsep.daemon [-] privsep: reply[536a1092-48f7-485f-b098-0df3c25261f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.264 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.265 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.27390670776367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.266 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.266 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.265 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[25032972-6df8-4354-bb76-4b34b6533696]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75a459da-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:7c:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788084, 'reachable_time': 16149, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246779, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.282 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[4487604b-4eab-4c90-ad6f-907fef4a9db2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:7c24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 788084, 'tstamp': 788084}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246780, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.304 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[eefe986f-c096-40e3-93d6-ca4d9477e28d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75a459da-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:7c:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788084, 'reachable_time': 16149, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246781, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.343 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Instance 7f178c92-047c-4473-8ac2-6fc099de6eac actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.344 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.344 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.349 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3b88890a-35cc-4a4f-9ae5-c280ca922ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.412 186853 DEBUG nova.compute.manager [req-d746a0ae-f9ae-45c7-a5e1-642df4eeb7ee req-5f4b068f-0a36-47a9-9c09-8a8c9780ee48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.414 186853 DEBUG oslo_concurrency.lockutils [req-d746a0ae-f9ae-45c7-a5e1-642df4eeb7ee req-5f4b068f-0a36-47a9-9c09-8a8c9780ee48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.414 186853 DEBUG oslo_concurrency.lockutils [req-d746a0ae-f9ae-45c7-a5e1-642df4eeb7ee req-5f4b068f-0a36-47a9-9c09-8a8c9780ee48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.415 186853 DEBUG oslo_concurrency.lockutils [req-d746a0ae-f9ae-45c7-a5e1-642df4eeb7ee req-5f4b068f-0a36-47a9-9c09-8a8c9780ee48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.415 186853 DEBUG nova.compute.manager [req-d746a0ae-f9ae-45c7-a5e1-642df4eeb7ee req-5f4b068f-0a36-47a9-9c09-8a8c9780ee48 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Processing event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.419 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.420 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[9d92b1b9-ae25-4c35-ac7c-b24dbc851f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.422 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a459da-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.422 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.422 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75a459da-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:45:38 np0005531887 NetworkManager[55210]: <info>  [1763801138.4254] manager: (tap75a459da-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.424 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 kernel: tap75a459da-40: entered promiscuous mode
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.429 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75a459da-40, col_values=(('external_ids', {'iface-id': '39800cc0-ce78-44f5-a846-1b0efde3902d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:45:38 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:38Z|00581|binding|INFO|Releasing lport 39800cc0-ce78-44f5-a846-1b0efde3902d from this chassis (sb_readonly=0)
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.432 104084 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.433 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6ca99e-f2cd-4a2a-b0fc-3f0480b4c319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.433 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.434 104084 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: global
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    log         /dev/log local0 debug
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    log-tag     haproxy-metadata-proxy-75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    user        root
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    group       root
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    maxconn     1024
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    pidfile     /var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    daemon
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: defaults
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    log global
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    mode http
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    option httplog
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    option dontlognull
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    option http-server-close
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    option forwardfor
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    retries                 3
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    timeout http-request    30s
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    timeout connect         30s
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    timeout client          32s
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    timeout server          32s
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    timeout http-keep-alive 30s
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: listen listener
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    bind 169.254.169.254:80
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    server metadata /var/lib/neutron/metadata_proxy
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]:    http-request add-header X-OVN-Network-ID 75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 22 03:45:38 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:45:38.436 104084 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'env', 'PROCESS_TAG=haproxy-75a459da-4098-4237-9a69-6ce91c909b9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75a459da-4098-4237-9a69-6ce91c909b9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.442 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801138.4424794, 7f178c92-047c-4473-8ac2-6fc099de6eac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.443 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] VM Started (Lifecycle Event)#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.445 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.446 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.449 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.453 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Instance spawned successfully.#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.454 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.458 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.459 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.461 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.464 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.491 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.492 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.492 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.493 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.493 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.493 186853 DEBUG nova.virt.libvirt.driver [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.500 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.500 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801138.4433703, 7f178c92-047c-4473-8ac2-6fc099de6eac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.500 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] VM Paused (Lifecycle Event)#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.538 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.541 186853 DEBUG nova.virt.driver [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] Emitting event <LifecycleEvent: 1763801138.4484272, 7f178c92-047c-4473-8ac2-6fc099de6eac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.542 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] VM Resumed (Lifecycle Event)#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.545 186853 DEBUG nova.network.neutron [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updated VIF entry in instance network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.546 186853 DEBUG nova.network.neutron [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updating instance_info_cache with network_info: [{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.575 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.577 186853 DEBUG oslo_concurrency.lockutils [req-5c3a7311-c465-4f65-8ae7-653f36c42d13 req-ad1d1109-8556-4533-8e17-daa18308feda 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.579 186853 DEBUG nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.603 186853 INFO nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Took 4.24 seconds to spawn the instance on the hypervisor.#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.603 186853 DEBUG nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.605 186853 INFO nova.compute.manager [None req-7cf85c03-ab87-42bd-a6f9-9eb8e6fdff76 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.666 186853 INFO nova.compute.manager [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Took 4.66 seconds to build instance.#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.681 186853 DEBUG oslo_concurrency.lockutils [None req-113c5e0d-54d4-4d2e-8331-bd9336586b1f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:38 np0005531887 nova_compute[186849]: 2025-11-22 08:45:38.682 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:38 np0005531887 podman[246821]: 2025-11-22 08:45:38.88728874 +0000 UTC m=+0.065786426 container create 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:45:38 np0005531887 systemd[1]: Started libpod-conmon-1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113.scope.
Nov 22 03:45:38 np0005531887 podman[246821]: 2025-11-22 08:45:38.847086814 +0000 UTC m=+0.025584520 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 03:45:38 np0005531887 systemd[1]: Started libcrun container.
Nov 22 03:45:38 np0005531887 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f913ed148ef46bdd067f1549cb7bce58ffd5fce5a68ea61232bdbec0e0df3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 03:45:38 np0005531887 podman[246821]: 2025-11-22 08:45:38.984828065 +0000 UTC m=+0.163325781 container init 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:45:38 np0005531887 podman[246821]: 2025-11-22 08:45:38.992389861 +0000 UTC m=+0.170887557 container start 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:45:39 np0005531887 podman[246834]: 2025-11-22 08:45:39.004389955 +0000 UTC m=+0.075505015 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:45:39 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [NOTICE]   (246875) : New worker (246882) forked
Nov 22 03:45:39 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [NOTICE]   (246875) : Loading success.
Nov 22 03:45:39 np0005531887 podman[246837]: 2025-11-22 08:45:39.028528697 +0000 UTC m=+0.094135901 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.454 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.455 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.551 186853 DEBUG nova.compute.manager [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.551 186853 DEBUG oslo_concurrency.lockutils [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.552 186853 DEBUG oslo_concurrency.lockutils [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.552 186853 DEBUG oslo_concurrency.lockutils [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.552 186853 DEBUG nova.compute.manager [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] No waiting events found dispatching network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:45:40 np0005531887 nova_compute[186849]: 2025-11-22 08:45:40.552 186853 WARNING nova.compute.manager [req-5203893e-7388-4c71-b311-853f012ddf9c req-8d27ad42-e2ad-44dd-a04c-8e08f53a74ce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received unexpected event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 for instance with vm_state active and task_state None.#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.118 186853 DEBUG nova.compute.manager [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.118 186853 DEBUG nova.compute.manager [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing instance network info cache due to event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.119 186853 DEBUG oslo_concurrency.lockutils [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.119 186853 DEBUG oslo_concurrency.lockutils [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.119 186853 DEBUG nova.network.neutron [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.218 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.655 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.678 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Triggering sync for uuid 7f178c92-047c-4473-8ac2-6fc099de6eac _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.678 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.679 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.704 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:45:42 np0005531887 nova_compute[186849]: 2025-11-22 08:45:42.791 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:45:43 np0005531887 nova_compute[186849]: 2025-11-22 08:45:43.615 186853 DEBUG nova.network.neutron [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updated VIF entry in instance network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:45:43 np0005531887 nova_compute[186849]: 2025-11-22 08:45:43.615 186853 DEBUG nova.network.neutron [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updating instance_info_cache with network_info: [{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 03:45:43 np0005531887 nova_compute[186849]: 2025-11-22 08:45:43.636 186853 DEBUG oslo_concurrency.lockutils [req-31bc6f3c-5d3c-4063-9bad-9c6d831ba194 req-1b9b7caa-a544-49ee-8680-acfe0fc66cec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 03:45:43 np0005531887 nova_compute[186849]: 2025-11-22 08:45:43.687 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:43 np0005531887 nova_compute[186849]: 2025-11-22 08:45:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:45:43 np0005531887 podman[246893]: 2025-11-22 08:45:43.837647109 +0000 UTC m=+0.053615537 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:45:47 np0005531887 nova_compute[186849]: 2025-11-22 08:45:47.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:48 np0005531887 nova_compute[186849]: 2025-11-22 08:45:48.687 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:50 np0005531887 podman[246932]: 2025-11-22 08:45:50.842283199 +0000 UTC m=+0.060497286 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:45:52 np0005531887 nova_compute[186849]: 2025-11-22 08:45:52.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:52 np0005531887 nova_compute[186849]: 2025-11-22 08:45:52.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:45:52 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:52Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:7b:46 10.100.0.11
Nov 22 03:45:52 np0005531887 ovn_controller[95130]: 2025-11-22T08:45:52Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:7b:46 10.100.0.11
Nov 22 03:45:53 np0005531887 nova_compute[186849]: 2025-11-22 08:45:53.689 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:54 np0005531887 podman[246963]: 2025-11-22 08:45:54.868875329 +0000 UTC m=+0.089260832 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:45:56 np0005531887 podman[246983]: 2025-11-22 08:45:56.841494236 +0000 UTC m=+0.057292428 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:45:57 np0005531887 nova_compute[186849]: 2025-11-22 08:45:57.231 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:45:58 np0005531887 nova_compute[186849]: 2025-11-22 08:45:58.692 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:02 np0005531887 nova_compute[186849]: 2025-11-22 08:46:02.239 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:03 np0005531887 nova_compute[186849]: 2025-11-22 08:46:03.694 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:04 np0005531887 podman[247005]: 2025-11-22 08:46:04.838372533 +0000 UTC m=+0.061162052 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, version=9.6)
Nov 22 03:46:07 np0005531887 nova_compute[186849]: 2025-11-22 08:46:07.242 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:08 np0005531887 nova_compute[186849]: 2025-11-22 08:46:08.696 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:09 np0005531887 podman[247034]: 2025-11-22 08:46:09.524069205 +0000 UTC m=+0.062167378 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:46:09 np0005531887 podman[247035]: 2025-11-22 08:46:09.55975827 +0000 UTC m=+0.093385153 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:46:11 np0005531887 ovn_controller[95130]: 2025-11-22T08:46:11Z|00582|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 22 03:46:12 np0005531887 nova_compute[186849]: 2025-11-22 08:46:12.244 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:13 np0005531887 nova_compute[186849]: 2025-11-22 08:46:13.699 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:14 np0005531887 podman[247094]: 2025-11-22 08:46:14.147457856 +0000 UTC m=+0.070993704 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:46:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:16.210 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:46:16 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:16.211 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:46:16 np0005531887 nova_compute[186849]: 2025-11-22 08:46:16.212 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:17 np0005531887 nova_compute[186849]: 2025-11-22 08:46:17.247 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:18 np0005531887 nova_compute[186849]: 2025-11-22 08:46:18.701 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:21 np0005531887 podman[247142]: 2025-11-22 08:46:21.145454433 +0000 UTC m=+0.079260217 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.774 186853 DEBUG nova.compute.manager [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.775 186853 DEBUG nova.compute.manager [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing instance network info cache due to event network-changed-88d05ad1-3553-48ae-a1b7-c602b2689e38. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.775 186853 DEBUG oslo_concurrency.lockutils [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.775 186853 DEBUG oslo_concurrency.lockutils [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.775 186853 DEBUG nova.network.neutron [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Refreshing network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.855 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.856 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.857 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.857 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.857 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.865 186853 INFO nova.compute.manager [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Terminating instance
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.871 186853 DEBUG nova.compute.manager [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 03:46:21 np0005531887 kernel: tap88d05ad1-35 (unregistering): left promiscuous mode
Nov 22 03:46:21 np0005531887 NetworkManager[55210]: <info>  [1763801181.8926] device (tap88d05ad1-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.897 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:46:21Z|00583|binding|INFO|Releasing lport 88d05ad1-3553-48ae-a1b7-c602b2689e38 from this chassis (sb_readonly=0)
Nov 22 03:46:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:46:21Z|00584|binding|INFO|Setting lport 88d05ad1-3553-48ae-a1b7-c602b2689e38 down in Southbound
Nov 22 03:46:21 np0005531887 ovn_controller[95130]: 2025-11-22T08:46:21Z|00585|binding|INFO|Removing iface tap88d05ad1-35 ovn-installed in OVS
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.900 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:46:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:21.913 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:7b:46 10.100.0.11'], port_security=['fa:16:3e:e8:7b:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7f178c92-047c-4473-8ac2-6fc099de6eac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b34698d5-a6b0-4396-ae15-77dfaa064741', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>], logical_port=88d05ad1-3553-48ae-a1b7-c602b2689e38) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84c9b146a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:46:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:21.914 104084 INFO neutron.agent.ovn.metadata.agent [-] Port 88d05ad1-3553-48ae-a1b7-c602b2689e38 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c unbound from our chassis#033[00m
Nov 22 03:46:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:21.915 104084 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75a459da-4098-4237-9a69-6ce91c909b9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 22 03:46:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:21.916 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[39175f46-bee1-4dc6-b128-1785d1206411]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:21 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:21.917 104084 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c namespace which is not needed anymore#033[00m
Nov 22 03:46:21 np0005531887 nova_compute[186849]: 2025-11-22 08:46:21.918 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:21 np0005531887 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Nov 22 03:46:21 np0005531887 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d000000b1.scope: Consumed 15.074s CPU time.
Nov 22 03:46:21 np0005531887 systemd-machined[153180]: Machine qemu-61-instance-000000b1 terminated.
Nov 22 03:46:22 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [NOTICE]   (246875) : haproxy version is 2.8.14-c23fe91
Nov 22 03:46:22 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [NOTICE]   (246875) : path to executable is /usr/sbin/haproxy
Nov 22 03:46:22 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [WARNING]  (246875) : Exiting Master process...
Nov 22 03:46:22 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [ALERT]    (246875) : Current worker (246882) exited with code 143 (Terminated)
Nov 22 03:46:22 np0005531887 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[246838]: [WARNING]  (246875) : All workers exited. Exiting... (0)
Nov 22 03:46:22 np0005531887 systemd[1]: libpod-1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113.scope: Deactivated successfully.
Nov 22 03:46:22 np0005531887 podman[247190]: 2025-11-22 08:46:22.068840131 +0000 UTC m=+0.052183012 container died 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.095 186853 DEBUG nova.compute.manager [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-unplugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.097 186853 DEBUG oslo_concurrency.lockutils [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.097 186853 DEBUG oslo_concurrency.lockutils [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.097 186853 DEBUG oslo_concurrency.lockutils [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.097 186853 DEBUG nova.compute.manager [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] No waiting events found dispatching network-vif-unplugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.098 186853 DEBUG nova.compute.manager [req-d47ee2cd-ba00-4a7c-8e50-dd621fce481a req-1ff5b060-01f9-460f-9333-9191eec752a4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-unplugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.098 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.101 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113-userdata-shm.mount: Deactivated successfully.
Nov 22 03:46:22 np0005531887 systemd[1]: var-lib-containers-storage-overlay-55f913ed148ef46bdd067f1549cb7bce58ffd5fce5a68ea61232bdbec0e0df3a-merged.mount: Deactivated successfully.
Nov 22 03:46:22 np0005531887 podman[247190]: 2025-11-22 08:46:22.143341031 +0000 UTC m=+0.126683922 container cleanup 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.145 186853 INFO nova.virt.libvirt.driver [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Instance destroyed successfully.#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.147 186853 DEBUG nova.objects.instance [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 7f178c92-047c-4473-8ac2-6fc099de6eac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 22 03:46:22 np0005531887 systemd[1]: libpod-conmon-1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113.scope: Deactivated successfully.
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.162 186853 DEBUG nova.virt.libvirt.vif [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:45:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2099507978',display_name='tempest-TestNetworkBasicOps-server-2099507978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2099507978',id=177,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJxBF+uV7kLGfNLSXaArojldE//69MaOCr1c5h9e6Oog3H1LUjI4I5mHFbCXPKNHbwYRpo/jUhVybrlSvevbkWLY/VbM4wAXCfm1OZDPNMTrj2iVHp/CG10iy05ELyqlVQ==',key_name='tempest-TestNetworkBasicOps-250415143',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:45:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ca2gj73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:45:38Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=7f178c92-047c-4473-8ac2-6fc099de6eac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.163 186853 DEBUG nova.network.os_vif_util [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.164 186853 DEBUG nova.network.os_vif_util [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.164 186853 DEBUG os_vif [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.167 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.167 186853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88d05ad1-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.171 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.175 186853 INFO os_vif [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:7b:46,bridge_name='br-int',has_traffic_filtering=True,id=88d05ad1-3553-48ae-a1b7-c602b2689e38,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88d05ad1-35')#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.176 186853 INFO nova.virt.libvirt.driver [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Deleting instance files /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac_del#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.177 186853 INFO nova.virt.libvirt.driver [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Deletion of /var/lib/nova/instances/7f178c92-047c-4473-8ac2-6fc099de6eac_del complete#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.242 186853 INFO nova.compute.manager [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.243 186853 DEBUG oslo.service.loopingcall [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.243 186853 DEBUG nova.compute.manager [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.244 186853 DEBUG nova.network.neutron [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 22 03:46:22 np0005531887 podman[247234]: 2025-11-22 08:46:22.244740799 +0000 UTC m=+0.074297525 container remove 1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.251 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[3675a6cc-8dfa-46bf-a446-6a98b1a89161]: (4, ('Sat Nov 22 08:46:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c (1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113)\n1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113\nSat Nov 22 08:46:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c (1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113)\n1fd0ad4c0ce8f54546decbafa5727c38fa45c1fae434413f221c7e99b5402113\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.253 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[65b58995-add0-45b8-bd7d-f4a51d130d37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.255 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a459da-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:46:22 np0005531887 kernel: tap75a459da-40: left promiscuous mode
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.258 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.275 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[801a9778-d974-4598-ba48-9b44a4ee539d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.294 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c74d80-2242-4c9d-9497-cacc516bb4c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.296 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[50c49ca2-648b-43d6-b61e-6f0e46968c2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.314 213790 DEBUG oslo.privsep.daemon [-] privsep: reply[13649116-91e5-46c1-89c3-d15169b66e51]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788074, 'reachable_time': 27632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247249, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.317 104200 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 22 03:46:22 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:22.317 104200 DEBUG oslo.privsep.daemon [-] privsep: reply[12aa96b3-d56b-4934-a5f3-0fbbbdc41614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 22 03:46:22 np0005531887 systemd[1]: run-netns-ovnmeta\x2d75a459da\x2d4098\x2d4237\x2d9a69\x2d6ce91c909b9c.mount: Deactivated successfully.
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.725 186853 DEBUG nova.network.neutron [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updated VIF entry in instance network info cache for port 88d05ad1-3553-48ae-a1b7-c602b2689e38. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.725 186853 DEBUG nova.network.neutron [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updating instance_info_cache with network_info: [{"id": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "address": "fa:16:3e:e8:7b:46", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88d05ad1-35", "ovs_interfaceid": "88d05ad1-3553-48ae-a1b7-c602b2689e38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.743 186853 DEBUG oslo_concurrency.lockutils [req-b5d34fe4-c2a0-46c5-a111-bc80d4346b24 req-026db9d8-92e9-44f0-80d6-249d1fcec8df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7f178c92-047c-4473-8ac2-6fc099de6eac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.863 186853 DEBUG nova.network.neutron [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.876 186853 INFO nova.compute.manager [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Took 0.63 seconds to deallocate network for instance.#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.941 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:22 np0005531887 nova_compute[186849]: 2025-11-22 08:46:22.942 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.004 186853 DEBUG nova.compute.provider_tree [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.025 186853 DEBUG nova.scheduler.client.report [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.049 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.076 186853 INFO nova.scheduler.client.report [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 7f178c92-047c-4473-8ac2-6fc099de6eac#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.146 186853 DEBUG oslo_concurrency.lockutils [None req-80343bf6-3b36-42eb-921d-43a553eb0b9e 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:23 np0005531887 nova_compute[186849]: 2025-11-22 08:46:23.704 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.163 186853 DEBUG nova.compute.manager [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.163 186853 DEBUG oslo_concurrency.lockutils [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.163 186853 DEBUG oslo_concurrency.lockutils [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.163 186853 DEBUG oslo_concurrency.lockutils [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7f178c92-047c-4473-8ac2-6fc099de6eac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.164 186853 DEBUG nova.compute.manager [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] No waiting events found dispatching network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.164 186853 WARNING nova.compute.manager [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received unexpected event network-vif-plugged-88d05ad1-3553-48ae-a1b7-c602b2689e38 for instance with vm_state deleted and task_state None.#033[00m
Nov 22 03:46:24 np0005531887 nova_compute[186849]: 2025-11-22 08:46:24.165 186853 DEBUG nova.compute.manager [req-b2056773-3bd5-47a4-b7d0-268defeed7aa req-30478bbe-f903-40b8-a1eb-877ab4d2eae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Received event network-vif-deleted-88d05ad1-3553-48ae-a1b7-c602b2689e38 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 22 03:46:25 np0005531887 podman[247260]: 2025-11-22 08:46:25.199389014 +0000 UTC m=+0.074509730 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 03:46:26 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:26.215 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:46:27 np0005531887 nova_compute[186849]: 2025-11-22 08:46:27.169 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:27 np0005531887 podman[247291]: 2025-11-22 08:46:27.836562306 +0000 UTC m=+0.056185951 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:46:28 np0005531887 nova_compute[186849]: 2025-11-22 08:46:28.706 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:28 np0005531887 nova_compute[186849]: 2025-11-22 08:46:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:30 np0005531887 nova_compute[186849]: 2025-11-22 08:46:30.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:30 np0005531887 nova_compute[186849]: 2025-11-22 08:46:30.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:46:30 np0005531887 nova_compute[186849]: 2025-11-22 08:46:30.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:46:30 np0005531887 nova_compute[186849]: 2025-11-22 08:46:30.786 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:46:32 np0005531887 nova_compute[186849]: 2025-11-22 08:46:32.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:33 np0005531887 nova_compute[186849]: 2025-11-22 08:46:33.707 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:33 np0005531887 nova_compute[186849]: 2025-11-22 08:46:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:33 np0005531887 nova_compute[186849]: 2025-11-22 08:46:33.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:46:35 np0005531887 podman[247315]: 2025-11-22 08:46:35.878157862 +0000 UTC m=+0.089312184 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:46:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:46:37 np0005531887 nova_compute[186849]: 2025-11-22 08:46:37.144 186853 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801182.142598, 7f178c92-047c-4473-8ac2-6fc099de6eac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 22 03:46:37 np0005531887 nova_compute[186849]: 2025-11-22 08:46:37.144 186853 INFO nova.compute.manager [-] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] VM Stopped (Lifecycle Event)#033[00m
Nov 22 03:46:37 np0005531887 nova_compute[186849]: 2025-11-22 08:46:37.159 186853 DEBUG nova.compute.manager [None req-e5f86c2e-4ae2-47b9-b2a9-886d2578edc5 - - - - - -] [instance: 7f178c92-047c-4473-8ac2-6fc099de6eac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 22 03:46:37 np0005531887 nova_compute[186849]: 2025-11-22 08:46:37.173 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:37.383 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:37.383 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:46:37.383 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.431 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.503 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.708 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.801 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.801 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:38 np0005531887 nova_compute[186849]: 2025-11-22 08:46:38.801 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.005 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.006 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5742MB free_disk=73.27388381958008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.006 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.007 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.065 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.066 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.079 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.096 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.096 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.119 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.140 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.163 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.183 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.254 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:46:39 np0005531887 nova_compute[186849]: 2025-11-22 08:46:39.254 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:46:39 np0005531887 podman[247340]: 2025-11-22 08:46:39.850570412 +0000 UTC m=+0.068244065 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:46:39 np0005531887 podman[247341]: 2025-11-22 08:46:39.912825381 +0000 UTC m=+0.124780374 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:46:40 np0005531887 nova_compute[186849]: 2025-11-22 08:46:40.249 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:41 np0005531887 nova_compute[186849]: 2025-11-22 08:46:41.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:42 np0005531887 nova_compute[186849]: 2025-11-22 08:46:42.175 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:43 np0005531887 nova_compute[186849]: 2025-11-22 08:46:43.710 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:43 np0005531887 nova_compute[186849]: 2025-11-22 08:46:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:44 np0005531887 nova_compute[186849]: 2025-11-22 08:46:44.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:44 np0005531887 podman[247387]: 2025-11-22 08:46:44.834459754 +0000 UTC m=+0.055018472 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:46:47 np0005531887 nova_compute[186849]: 2025-11-22 08:46:47.177 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:48 np0005531887 nova_compute[186849]: 2025-11-22 08:46:48.712 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:51 np0005531887 podman[247411]: 2025-11-22 08:46:51.827222032 +0000 UTC m=+0.051000333 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 03:46:52 np0005531887 nova_compute[186849]: 2025-11-22 08:46:52.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:53 np0005531887 nova_compute[186849]: 2025-11-22 08:46:53.714 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:54 np0005531887 nova_compute[186849]: 2025-11-22 08:46:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:46:55 np0005531887 podman[247429]: 2025-11-22 08:46:55.843234902 +0000 UTC m=+0.062304250 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:46:57 np0005531887 nova_compute[186849]: 2025-11-22 08:46:57.182 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:58 np0005531887 nova_compute[186849]: 2025-11-22 08:46:58.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:46:58 np0005531887 podman[247449]: 2025-11-22 08:46:58.837182992 +0000 UTC m=+0.056427837 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:47:02 np0005531887 nova_compute[186849]: 2025-11-22 08:47:02.184 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:03 np0005531887 nova_compute[186849]: 2025-11-22 08:47:03.718 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:04 np0005531887 nova_compute[186849]: 2025-11-22 08:47:04.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:06 np0005531887 podman[247473]: 2025-11-22 08:47:06.842225221 +0000 UTC m=+0.060645959 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 22 03:47:07 np0005531887 nova_compute[186849]: 2025-11-22 08:47:07.185 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:08 np0005531887 nova_compute[186849]: 2025-11-22 08:47:08.719 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:10 np0005531887 podman[247495]: 2025-11-22 08:47:10.846225417 +0000 UTC m=+0.060694562 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 03:47:10 np0005531887 podman[247496]: 2025-11-22 08:47:10.864046545 +0000 UTC m=+0.075882075 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:47:12 np0005531887 nova_compute[186849]: 2025-11-22 08:47:12.187 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:13 np0005531887 nova_compute[186849]: 2025-11-22 08:47:13.721 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:15 np0005531887 podman[247538]: 2025-11-22 08:47:15.854535348 +0000 UTC m=+0.076081639 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:47:17 np0005531887 nova_compute[186849]: 2025-11-22 08:47:17.190 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:18 np0005531887 nova_compute[186849]: 2025-11-22 08:47:18.724 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:20.371 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:47:20 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:20.372 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:47:20 np0005531887 nova_compute[186849]: 2025-11-22 08:47:20.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:20 np0005531887 ovn_controller[95130]: 2025-11-22T08:47:20Z|00586|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 22 03:47:22 np0005531887 nova_compute[186849]: 2025-11-22 08:47:22.192 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:22 np0005531887 podman[247562]: 2025-11-22 08:47:22.829506828 +0000 UTC m=+0.050060730 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:47:23 np0005531887 nova_compute[186849]: 2025-11-22 08:47:23.727 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:26 np0005531887 podman[247581]: 2025-11-22 08:47:26.83946456 +0000 UTC m=+0.058413444 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:47:27 np0005531887 nova_compute[186849]: 2025-11-22 08:47:27.193 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:27 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:27.374 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:47:28 np0005531887 nova_compute[186849]: 2025-11-22 08:47:28.730 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:29 np0005531887 nova_compute[186849]: 2025-11-22 08:47:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:29 np0005531887 podman[247601]: 2025-11-22 08:47:29.824690886 +0000 UTC m=+0.043549270 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 03:47:32 np0005531887 nova_compute[186849]: 2025-11-22 08:47:32.195 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:32 np0005531887 nova_compute[186849]: 2025-11-22 08:47:32.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:32 np0005531887 nova_compute[186849]: 2025-11-22 08:47:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:47:32 np0005531887 nova_compute[186849]: 2025-11-22 08:47:32.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:47:32 np0005531887 nova_compute[186849]: 2025-11-22 08:47:32.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:47:33 np0005531887 nova_compute[186849]: 2025-11-22 08:47:33.732 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:33 np0005531887 nova_compute[186849]: 2025-11-22 08:47:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:33 np0005531887 nova_compute[186849]: 2025-11-22 08:47:33.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:47:37 np0005531887 nova_compute[186849]: 2025-11-22 08:47:37.197 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:37.383 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:37.384 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:37.384 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:37 np0005531887 podman[247629]: 2025-11-22 08:47:37.822930668 +0000 UTC m=+0.046967885 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Nov 22 03:47:38 np0005531887 nova_compute[186849]: 2025-11-22 08:47:38.733 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.808 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.979 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.982 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5749MB free_disk=73.2660903930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.982 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:47:40 np0005531887 nova_compute[186849]: 2025-11-22 08:47:40.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.320 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.320 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.374 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.388 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.390 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:47:41 np0005531887 nova_compute[186849]: 2025-11-22 08:47:41.390 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:47:41 np0005531887 podman[247650]: 2025-11-22 08:47:41.861234955 +0000 UTC m=+0.081611765 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:47:41 np0005531887 podman[247651]: 2025-11-22 08:47:41.874612064 +0000 UTC m=+0.088911205 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 03:47:42 np0005531887 nova_compute[186849]: 2025-11-22 08:47:42.199 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:42 np0005531887 nova_compute[186849]: 2025-11-22 08:47:42.390 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:43 np0005531887 nova_compute[186849]: 2025-11-22 08:47:43.735 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:43 np0005531887 nova_compute[186849]: 2025-11-22 08:47:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:44 np0005531887 nova_compute[186849]: 2025-11-22 08:47:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:46 np0005531887 podman[247696]: 2025-11-22 08:47:46.870071991 +0000 UTC m=+0.071963699 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:47:47 np0005531887 nova_compute[186849]: 2025-11-22 08:47:47.201 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:48 np0005531887 nova_compute[186849]: 2025-11-22 08:47:48.736 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:52 np0005531887 nova_compute[186849]: 2025-11-22 08:47:52.204 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:53 np0005531887 nova_compute[186849]: 2025-11-22 08:47:53.739 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:53 np0005531887 podman[247721]: 2025-11-22 08:47:53.829760997 +0000 UTC m=+0.052630733 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:47:56 np0005531887 nova_compute[186849]: 2025-11-22 08:47:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:47:57 np0005531887 nova_compute[186849]: 2025-11-22 08:47:57.206 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:57 np0005531887 podman[247742]: 2025-11-22 08:47:57.872701159 +0000 UTC m=+0.092255396 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:47:58 np0005531887 nova_compute[186849]: 2025-11-22 08:47:58.740 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:59.469 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:47:59 np0005531887 nova_compute[186849]: 2025-11-22 08:47:59.470 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:47:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:47:59.470 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:48:00 np0005531887 podman[247763]: 2025-11-22 08:48:00.842196349 +0000 UTC m=+0.058470057 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:48:01 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:01.473 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:02 np0005531887 nova_compute[186849]: 2025-11-22 08:48:02.210 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:03 np0005531887 nova_compute[186849]: 2025-11-22 08:48:03.742 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:07 np0005531887 nova_compute[186849]: 2025-11-22 08:48:07.212 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:08 np0005531887 nova_compute[186849]: 2025-11-22 08:48:08.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:08 np0005531887 podman[247785]: 2025-11-22 08:48:08.846562 +0000 UTC m=+0.058457625 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 03:48:12 np0005531887 nova_compute[186849]: 2025-11-22 08:48:12.215 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:12 np0005531887 podman[247806]: 2025-11-22 08:48:12.856365139 +0000 UTC m=+0.065905429 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 03:48:12 np0005531887 podman[247807]: 2025-11-22 08:48:12.889066022 +0000 UTC m=+0.096119921 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 03:48:13 np0005531887 nova_compute[186849]: 2025-11-22 08:48:13.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:17 np0005531887 nova_compute[186849]: 2025-11-22 08:48:17.217 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:17 np0005531887 podman[247853]: 2025-11-22 08:48:17.867926461 +0000 UTC m=+0.083818959 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:48:18 np0005531887 nova_compute[186849]: 2025-11-22 08:48:18.750 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:22 np0005531887 nova_compute[186849]: 2025-11-22 08:48:22.219 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:23 np0005531887 nova_compute[186849]: 2025-11-22 08:48:23.752 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:24 np0005531887 podman[247877]: 2025-11-22 08:48:24.834836241 +0000 UTC m=+0.050030519 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 03:48:27 np0005531887 nova_compute[186849]: 2025-11-22 08:48:27.222 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:28 np0005531887 nova_compute[186849]: 2025-11-22 08:48:28.756 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:28 np0005531887 podman[247897]: 2025-11-22 08:48:28.867164834 +0000 UTC m=+0.082511396 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 03:48:29 np0005531887 nova_compute[186849]: 2025-11-22 08:48:29.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:31 np0005531887 podman[247918]: 2025-11-22 08:48:31.828485113 +0000 UTC m=+0.048305357 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:48:32 np0005531887 nova_compute[186849]: 2025-11-22 08:48:32.224 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:33 np0005531887 nova_compute[186849]: 2025-11-22 08:48:33.758 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.783 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:34 np0005531887 nova_compute[186849]: 2025-11-22 08:48:34.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:48:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:48:37 np0005531887 nova_compute[186849]: 2025-11-22 08:48:37.226 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:37.385 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:37.385 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:37.385 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:38 np0005531887 nova_compute[186849]: 2025-11-22 08:48:38.759 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:39 np0005531887 podman[247944]: 2025-11-22 08:48:39.842401129 +0000 UTC m=+0.062519576 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 22 03:48:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:40.455 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:40 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:40.456 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.789 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.790 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.790 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.954 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.955 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5749MB free_disk=73.2660903930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.955 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:48:40 np0005531887 nova_compute[186849]: 2025-11-22 08:48:40.956 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.147 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.147 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.172 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.184 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.185 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:48:41 np0005531887 nova_compute[186849]: 2025-11-22 08:48:41.185 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:48:41 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:48:41.458 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:48:42 np0005531887 nova_compute[186849]: 2025-11-22 08:48:42.186 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:42 np0005531887 nova_compute[186849]: 2025-11-22 08:48:42.229 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:42 np0005531887 nova_compute[186849]: 2025-11-22 08:48:42.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:43 np0005531887 nova_compute[186849]: 2025-11-22 08:48:43.761 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:43 np0005531887 nova_compute[186849]: 2025-11-22 08:48:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:43 np0005531887 podman[247966]: 2025-11-22 08:48:43.857026016 +0000 UTC m=+0.063143981 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:48:43 np0005531887 podman[247967]: 2025-11-22 08:48:43.897025979 +0000 UTC m=+0.097711010 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 03:48:45 np0005531887 nova_compute[186849]: 2025-11-22 08:48:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:47 np0005531887 nova_compute[186849]: 2025-11-22 08:48:47.231 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:48 np0005531887 nova_compute[186849]: 2025-11-22 08:48:48.764 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:48 np0005531887 podman[248010]: 2025-11-22 08:48:48.83284113 +0000 UTC m=+0.052866060 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:48:52 np0005531887 nova_compute[186849]: 2025-11-22 08:48:52.232 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:53 np0005531887 nova_compute[186849]: 2025-11-22 08:48:53.767 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:55 np0005531887 podman[248037]: 2025-11-22 08:48:55.830066548 +0000 UTC m=+0.052256635 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 03:48:57 np0005531887 nova_compute[186849]: 2025-11-22 08:48:57.236 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:57 np0005531887 nova_compute[186849]: 2025-11-22 08:48:57.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:48:58 np0005531887 nova_compute[186849]: 2025-11-22 08:48:58.776 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:48:59 np0005531887 podman[248058]: 2025-11-22 08:48:59.860087533 +0000 UTC m=+0.077752611 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 03:49:02 np0005531887 nova_compute[186849]: 2025-11-22 08:49:02.239 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:02 np0005531887 podman[248078]: 2025-11-22 08:49:02.842033509 +0000 UTC m=+0.056928399 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:49:03 np0005531887 nova_compute[186849]: 2025-11-22 08:49:03.782 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:07 np0005531887 nova_compute[186849]: 2025-11-22 08:49:07.241 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:08 np0005531887 nova_compute[186849]: 2025-11-22 08:49:08.785 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:09 np0005531887 nova_compute[186849]: 2025-11-22 08:49:09.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:10 np0005531887 podman[248102]: 2025-11-22 08:49:10.883136972 +0000 UTC m=+0.081775630 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:49:12 np0005531887 nova_compute[186849]: 2025-11-22 08:49:12.242 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:13 np0005531887 nova_compute[186849]: 2025-11-22 08:49:13.788 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:14 np0005531887 podman[248123]: 2025-11-22 08:49:14.863516648 +0000 UTC m=+0.084845624 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:49:14 np0005531887 podman[248124]: 2025-11-22 08:49:14.89822276 +0000 UTC m=+0.114731768 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:49:17 np0005531887 nova_compute[186849]: 2025-11-22 08:49:17.244 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:18 np0005531887 nova_compute[186849]: 2025-11-22 08:49:18.790 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:19 np0005531887 podman[248167]: 2025-11-22 08:49:19.832143274 +0000 UTC m=+0.050205964 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:49:22 np0005531887 nova_compute[186849]: 2025-11-22 08:49:22.247 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:23 np0005531887 nova_compute[186849]: 2025-11-22 08:49:23.792 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:26 np0005531887 podman[248190]: 2025-11-22 08:49:26.847234251 +0000 UTC m=+0.065540960 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:49:27 np0005531887 nova_compute[186849]: 2025-11-22 08:49:27.249 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:28 np0005531887 nova_compute[186849]: 2025-11-22 08:49:28.794 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:29 np0005531887 nova_compute[186849]: 2025-11-22 08:49:29.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:30 np0005531887 podman[248209]: 2025-11-22 08:49:30.832032825 +0000 UTC m=+0.050619665 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:49:32 np0005531887 nova_compute[186849]: 2025-11-22 08:49:32.251 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:33 np0005531887 nova_compute[186849]: 2025-11-22 08:49:33.796 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:33 np0005531887 podman[248229]: 2025-11-22 08:49:33.82907289 +0000 UTC m=+0.052079110 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:49:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:34.410 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:49:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:34.411 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:49:34 np0005531887 nova_compute[186849]: 2025-11-22 08:49:34.411 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:35.413 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:36 np0005531887 nova_compute[186849]: 2025-11-22 08:49:36.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:49:37 np0005531887 nova_compute[186849]: 2025-11-22 08:49:37.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:37.385 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:37.386 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:49:37.386 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:38 np0005531887 nova_compute[186849]: 2025-11-22 08:49:38.797 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.941 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.942 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.26610946655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.942 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.942 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:49:40 np0005531887 nova_compute[186849]: 2025-11-22 08:49:40.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:49:41 np0005531887 nova_compute[186849]: 2025-11-22 08:49:41.017 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:49:41 np0005531887 nova_compute[186849]: 2025-11-22 08:49:41.029 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:49:41 np0005531887 nova_compute[186849]: 2025-11-22 08:49:41.031 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:49:41 np0005531887 nova_compute[186849]: 2025-11-22 08:49:41.031 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:49:41 np0005531887 podman[248255]: 2025-11-22 08:49:41.84096031 +0000 UTC m=+0.059613434 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:49:42 np0005531887 nova_compute[186849]: 2025-11-22 08:49:42.031 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:42 np0005531887 nova_compute[186849]: 2025-11-22 08:49:42.254 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:42 np0005531887 nova_compute[186849]: 2025-11-22 08:49:42.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:43 np0005531887 nova_compute[186849]: 2025-11-22 08:49:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:43 np0005531887 nova_compute[186849]: 2025-11-22 08:49:43.798 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:45 np0005531887 nova_compute[186849]: 2025-11-22 08:49:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:49:45 np0005531887 podman[248276]: 2025-11-22 08:49:45.829042664 +0000 UTC m=+0.053375202 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:49:45 np0005531887 podman[248277]: 2025-11-22 08:49:45.868290957 +0000 UTC m=+0.088359660 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 03:49:47 np0005531887 nova_compute[186849]: 2025-11-22 08:49:47.256 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:48 np0005531887 nova_compute[186849]: 2025-11-22 08:49:48.801 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:50 np0005531887 podman[248322]: 2025-11-22 08:49:50.832525636 +0000 UTC m=+0.052082400 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:49:52 np0005531887 nova_compute[186849]: 2025-11-22 08:49:52.259 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:53 np0005531887 nova_compute[186849]: 2025-11-22 08:49:53.804 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:57 np0005531887 nova_compute[186849]: 2025-11-22 08:49:57.267 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:57 np0005531887 podman[248346]: 2025-11-22 08:49:57.833107107 +0000 UTC m=+0.050166793 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 03:49:58 np0005531887 nova_compute[186849]: 2025-11-22 08:49:58.810 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:49:59 np0005531887 nova_compute[186849]: 2025-11-22 08:49:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:01 np0005531887 podman[248365]: 2025-11-22 08:50:01.831023384 +0000 UTC m=+0.055224526 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:50:02 np0005531887 nova_compute[186849]: 2025-11-22 08:50:02.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:03 np0005531887 nova_compute[186849]: 2025-11-22 08:50:03.815 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:04 np0005531887 podman[248385]: 2025-11-22 08:50:04.834372315 +0000 UTC m=+0.048441750 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 03:50:06 np0005531887 nova_compute[186849]: 2025-11-22 08:50:06.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:06 np0005531887 nova_compute[186849]: 2025-11-22 08:50:06.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 03:50:06 np0005531887 nova_compute[186849]: 2025-11-22 08:50:06.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 03:50:07 np0005531887 nova_compute[186849]: 2025-11-22 08:50:07.270 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:07 np0005531887 nova_compute[186849]: 2025-11-22 08:50:07.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:07 np0005531887 nova_compute[186849]: 2025-11-22 08:50:07.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 03:50:08 np0005531887 nova_compute[186849]: 2025-11-22 08:50:08.817 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:12 np0005531887 nova_compute[186849]: 2025-11-22 08:50:12.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:12 np0005531887 podman[248408]: 2025-11-22 08:50:12.885348222 +0000 UTC m=+0.090637697 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:50:13 np0005531887 nova_compute[186849]: 2025-11-22 08:50:13.820 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:16 np0005531887 podman[248430]: 2025-11-22 08:50:16.846104566 +0000 UTC m=+0.063303805 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 03:50:16 np0005531887 podman[248431]: 2025-11-22 08:50:16.88329018 +0000 UTC m=+0.096183813 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:50:17 np0005531887 nova_compute[186849]: 2025-11-22 08:50:17.275 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:18 np0005531887 nova_compute[186849]: 2025-11-22 08:50:18.822 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:21 np0005531887 podman[248477]: 2025-11-22 08:50:21.834846908 +0000 UTC m=+0.058619270 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:50:22 np0005531887 nova_compute[186849]: 2025-11-22 08:50:22.277 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:22 np0005531887 nova_compute[186849]: 2025-11-22 08:50:22.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:23 np0005531887 nova_compute[186849]: 2025-11-22 08:50:23.824 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:27 np0005531887 nova_compute[186849]: 2025-11-22 08:50:27.279 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:28 np0005531887 nova_compute[186849]: 2025-11-22 08:50:28.826 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:28 np0005531887 podman[248503]: 2025-11-22 08:50:28.852953729 +0000 UTC m=+0.077529215 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:50:29 np0005531887 nova_compute[186849]: 2025-11-22 08:50:29.777 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:32 np0005531887 nova_compute[186849]: 2025-11-22 08:50:32.281 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:32 np0005531887 podman[248524]: 2025-11-22 08:50:32.838202984 +0000 UTC m=+0.062390643 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 03:50:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:33.500 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:50:33 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:33.501 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:50:33 np0005531887 nova_compute[186849]: 2025-11-22 08:50:33.501 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:33 np0005531887 nova_compute[186849]: 2025-11-22 08:50:33.829 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:35 np0005531887 podman[248544]: 2025-11-22 08:50:35.830895624 +0000 UTC m=+0.050703237 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:50:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:50:37 np0005531887 nova_compute[186849]: 2025-11-22 08:50:37.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:37.386 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:37.387 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:37.387 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:50:37.503 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:50:37 np0005531887 nova_compute[186849]: 2025-11-22 08:50:37.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:37 np0005531887 nova_compute[186849]: 2025-11-22 08:50:37.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:50:38 np0005531887 nova_compute[186849]: 2025-11-22 08:50:38.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:38 np0005531887 nova_compute[186849]: 2025-11-22 08:50:38.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:50:38 np0005531887 nova_compute[186849]: 2025-11-22 08:50:38.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:50:38 np0005531887 nova_compute[186849]: 2025-11-22 08:50:38.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:50:38 np0005531887 nova_compute[186849]: 2025-11-22 08:50:38.830 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.286 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.981 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.983 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.26610946655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:50:42 np0005531887 nova_compute[186849]: 2025-11-22 08:50:42.983 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.039 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.040 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.064 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.075 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.078 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.079 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:50:43 np0005531887 nova_compute[186849]: 2025-11-22 08:50:43.833 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:43 np0005531887 podman[248568]: 2025-11-22 08:50:43.841670163 +0000 UTC m=+0.063997672 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 22 03:50:44 np0005531887 nova_compute[186849]: 2025-11-22 08:50:44.080 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:44 np0005531887 nova_compute[186849]: 2025-11-22 08:50:44.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:45 np0005531887 nova_compute[186849]: 2025-11-22 08:50:45.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:46 np0005531887 nova_compute[186849]: 2025-11-22 08:50:46.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:50:47 np0005531887 nova_compute[186849]: 2025-11-22 08:50:47.288 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:47 np0005531887 podman[248589]: 2025-11-22 08:50:47.852125687 +0000 UTC m=+0.067167530 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:50:47 np0005531887 podman[248590]: 2025-11-22 08:50:47.903642092 +0000 UTC m=+0.114565044 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:50:48 np0005531887 nova_compute[186849]: 2025-11-22 08:50:48.835 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:52 np0005531887 nova_compute[186849]: 2025-11-22 08:50:52.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:52 np0005531887 podman[248635]: 2025-11-22 08:50:52.832929563 +0000 UTC m=+0.051038885 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:50:53 np0005531887 nova_compute[186849]: 2025-11-22 08:50:53.838 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:57 np0005531887 nova_compute[186849]: 2025-11-22 08:50:57.298 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:58 np0005531887 nova_compute[186849]: 2025-11-22 08:50:58.841 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:50:59 np0005531887 podman[248659]: 2025-11-22 08:50:59.866070373 +0000 UTC m=+0.083869260 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 03:51:01 np0005531887 nova_compute[186849]: 2025-11-22 08:51:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:02 np0005531887 nova_compute[186849]: 2025-11-22 08:51:02.301 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:03 np0005531887 nova_compute[186849]: 2025-11-22 08:51:03.843 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:03 np0005531887 podman[248678]: 2025-11-22 08:51:03.864707458 +0000 UTC m=+0.082216820 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 03:51:06 np0005531887 podman[248699]: 2025-11-22 08:51:06.837728193 +0000 UTC m=+0.059559553 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:51:07 np0005531887 nova_compute[186849]: 2025-11-22 08:51:07.303 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:08 np0005531887 nova_compute[186849]: 2025-11-22 08:51:08.846 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:12 np0005531887 nova_compute[186849]: 2025-11-22 08:51:12.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:13 np0005531887 nova_compute[186849]: 2025-11-22 08:51:13.848 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:14 np0005531887 nova_compute[186849]: 2025-11-22 08:51:14.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:14 np0005531887 podman[248724]: 2025-11-22 08:51:14.840948408 +0000 UTC m=+0.057701798 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, distribution-scope=public)
Nov 22 03:51:17 np0005531887 nova_compute[186849]: 2025-11-22 08:51:17.308 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:18 np0005531887 nova_compute[186849]: 2025-11-22 08:51:18.851 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:18 np0005531887 podman[248745]: 2025-11-22 08:51:18.891033406 +0000 UTC m=+0.096987542 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 03:51:18 np0005531887 podman[248746]: 2025-11-22 08:51:18.906753051 +0000 UTC m=+0.106183498 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 03:51:22 np0005531887 nova_compute[186849]: 2025-11-22 08:51:22.310 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:23 np0005531887 podman[248791]: 2025-11-22 08:51:23.861305633 +0000 UTC m=+0.068857512 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:51:27 np0005531887 nova_compute[186849]: 2025-11-22 08:51:27.313 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 22 03:51:28 np0005531887 nova_compute[186849]: 2025-11-22 08:51:28.854 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:30 np0005531887 podman[248818]: 2025-11-22 08:51:30.862038335 +0000 UTC m=+0.080009106 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:51:31 np0005531887 nova_compute[186849]: 2025-11-22 08:51:31.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:32 np0005531887 nova_compute[186849]: 2025-11-22 08:51:32.316 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:33 np0005531887 nova_compute[186849]: 2025-11-22 08:51:33.857 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:34 np0005531887 podman[248838]: 2025-11-22 08:51:34.84646162 +0000 UTC m=+0.063854639 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:51:37 np0005531887 nova_compute[186849]: 2025-11-22 08:51:37.318 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:51:37.387 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:51:37.388 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:51:37.388 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:37 np0005531887 podman[248858]: 2025-11-22 08:51:37.849686723 +0000 UTC m=+0.058031696 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:51:38 np0005531887 nova_compute[186849]: 2025-11-22 08:51:38.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:38 np0005531887 nova_compute[186849]: 2025-11-22 08:51:38.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:51:38 np0005531887 nova_compute[186849]: 2025-11-22 08:51:38.860 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:39 np0005531887 nova_compute[186849]: 2025-11-22 08:51:39.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:39 np0005531887 nova_compute[186849]: 2025-11-22 08:51:39.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:51:39 np0005531887 nova_compute[186849]: 2025-11-22 08:51:39.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:51:39 np0005531887 nova_compute[186849]: 2025-11-22 08:51:39.798 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:51:42 np0005531887 nova_compute[186849]: 2025-11-22 08:51:42.321 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:43 np0005531887 nova_compute[186849]: 2025-11-22 08:51:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:43 np0005531887 nova_compute[186849]: 2025-11-22 08:51:43.862 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.791 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.957 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.958 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5749MB free_disk=73.26619720458984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.958 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:51:44 np0005531887 nova_compute[186849]: 2025-11-22 08:51:44.958 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.018 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.019 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.104 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.152 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.152 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.169 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.191 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.220 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.232 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.234 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:51:45 np0005531887 nova_compute[186849]: 2025-11-22 08:51:45.234 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:51:45 np0005531887 podman[248884]: 2025-11-22 08:51:45.848546451 +0000 UTC m=+0.069545698 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:51:46 np0005531887 nova_compute[186849]: 2025-11-22 08:51:46.235 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:46 np0005531887 nova_compute[186849]: 2025-11-22 08:51:46.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:47 np0005531887 nova_compute[186849]: 2025-11-22 08:51:47.323 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:47 np0005531887 nova_compute[186849]: 2025-11-22 08:51:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:51:48 np0005531887 nova_compute[186849]: 2025-11-22 08:51:48.864 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:49 np0005531887 podman[248905]: 2025-11-22 08:51:49.878792102 +0000 UTC m=+0.093950848 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:51:49 np0005531887 podman[248906]: 2025-11-22 08:51:49.887093436 +0000 UTC m=+0.100398127 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:51:52 np0005531887 nova_compute[186849]: 2025-11-22 08:51:52.326 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:53 np0005531887 nova_compute[186849]: 2025-11-22 08:51:53.867 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:54 np0005531887 podman[248948]: 2025-11-22 08:51:54.844109382 +0000 UTC m=+0.056133669 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:51:57 np0005531887 nova_compute[186849]: 2025-11-22 08:51:57.327 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:51:58 np0005531887 nova_compute[186849]: 2025-11-22 08:51:58.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:00.811 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:52:00 np0005531887 nova_compute[186849]: 2025-11-22 08:52:00.812 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:00 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:00.813 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:52:01 np0005531887 nova_compute[186849]: 2025-11-22 08:52:01.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:01 np0005531887 podman[248972]: 2025-11-22 08:52:01.85252144 +0000 UTC m=+0.058068516 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:52:02 np0005531887 nova_compute[186849]: 2025-11-22 08:52:02.330 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:02 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:02.815 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:52:03 np0005531887 nova_compute[186849]: 2025-11-22 08:52:03.871 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:05 np0005531887 podman[248991]: 2025-11-22 08:52:05.861378618 +0000 UTC m=+0.074429229 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:52:07 np0005531887 nova_compute[186849]: 2025-11-22 08:52:07.333 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:08 np0005531887 nova_compute[186849]: 2025-11-22 08:52:08.873 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:08 np0005531887 podman[249012]: 2025-11-22 08:52:08.88254609 +0000 UTC m=+0.086147745 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:52:12 np0005531887 nova_compute[186849]: 2025-11-22 08:52:12.335 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:13 np0005531887 nova_compute[186849]: 2025-11-22 08:52:13.874 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:16 np0005531887 podman[249036]: 2025-11-22 08:52:16.847003005 +0000 UTC m=+0.062778963 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 22 03:52:17 np0005531887 nova_compute[186849]: 2025-11-22 08:52:17.337 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:18 np0005531887 nova_compute[186849]: 2025-11-22 08:52:18.877 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:20 np0005531887 podman[249059]: 2025-11-22 08:52:20.838633757 +0000 UTC m=+0.056885168 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:52:20 np0005531887 podman[249060]: 2025-11-22 08:52:20.909589579 +0000 UTC m=+0.122822087 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:52:22 np0005531887 nova_compute[186849]: 2025-11-22 08:52:22.339 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:23 np0005531887 nova_compute[186849]: 2025-11-22 08:52:23.880 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:25 np0005531887 podman[249108]: 2025-11-22 08:52:25.42279902 +0000 UTC m=+0.064030084 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:52:27 np0005531887 nova_compute[186849]: 2025-11-22 08:52:27.341 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:28 np0005531887 nova_compute[186849]: 2025-11-22 08:52:28.882 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:32 np0005531887 nova_compute[186849]: 2025-11-22 08:52:32.343 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:32 np0005531887 podman[249133]: 2025-11-22 08:52:32.845400738 +0000 UTC m=+0.055557115 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 03:52:33 np0005531887 nova_compute[186849]: 2025-11-22 08:52:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:33 np0005531887 nova_compute[186849]: 2025-11-22 08:52:33.885 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:52:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:52:36 np0005531887 podman[249152]: 2025-11-22 08:52:36.843494619 +0000 UTC m=+0.062887195 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 03:52:37 np0005531887 nova_compute[186849]: 2025-11-22 08:52:37.346 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:37.388 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:37.389 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:52:37.389 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:38 np0005531887 nova_compute[186849]: 2025-11-22 08:52:38.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:39 np0005531887 nova_compute[186849]: 2025-11-22 08:52:39.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:39 np0005531887 nova_compute[186849]: 2025-11-22 08:52:39.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:52:39 np0005531887 podman[249172]: 2025-11-22 08:52:39.840810805 +0000 UTC m=+0.059300677 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:52:41 np0005531887 nova_compute[186849]: 2025-11-22 08:52:41.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:41 np0005531887 nova_compute[186849]: 2025-11-22 08:52:41.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:52:41 np0005531887 nova_compute[186849]: 2025-11-22 08:52:41.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:52:41 np0005531887 nova_compute[186849]: 2025-11-22 08:52:41.964 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:52:42 np0005531887 nova_compute[186849]: 2025-11-22 08:52:42.347 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:43 np0005531887 nova_compute[186849]: 2025-11-22 08:52:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:43 np0005531887 nova_compute[186849]: 2025-11-22 08:52:43.888 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.948 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.949 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5760MB free_disk=73.26623153686523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.950 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:52:44 np0005531887 nova_compute[186849]: 2025-11-22 08:52:44.950 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.097 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.098 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.130 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.144 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.146 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:52:45 np0005531887 nova_compute[186849]: 2025-11-22 08:52:45.147 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:52:47 np0005531887 nova_compute[186849]: 2025-11-22 08:52:47.140 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:47 np0005531887 nova_compute[186849]: 2025-11-22 08:52:47.141 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:47 np0005531887 nova_compute[186849]: 2025-11-22 08:52:47.349 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:47 np0005531887 nova_compute[186849]: 2025-11-22 08:52:47.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:52:47 np0005531887 podman[249196]: 2025-11-22 08:52:47.839628744 +0000 UTC m=+0.058717563 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:52:48 np0005531887 nova_compute[186849]: 2025-11-22 08:52:48.890 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:51 np0005531887 podman[249216]: 2025-11-22 08:52:51.852410176 +0000 UTC m=+0.069542539 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:52:51 np0005531887 podman[249217]: 2025-11-22 08:52:51.875671078 +0000 UTC m=+0.093687332 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:52:52 np0005531887 nova_compute[186849]: 2025-11-22 08:52:52.351 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:53 np0005531887 nova_compute[186849]: 2025-11-22 08:52:53.893 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:55 np0005531887 podman[249262]: 2025-11-22 08:52:55.837662842 +0000 UTC m=+0.058009215 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:52:57 np0005531887 nova_compute[186849]: 2025-11-22 08:52:57.353 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:52:58 np0005531887 nova_compute[186849]: 2025-11-22 08:52:58.894 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:02 np0005531887 nova_compute[186849]: 2025-11-22 08:53:02.355 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:03 np0005531887 nova_compute[186849]: 2025-11-22 08:53:03.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:03 np0005531887 podman[249286]: 2025-11-22 08:53:03.835808183 +0000 UTC m=+0.057473762 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:53:03 np0005531887 nova_compute[186849]: 2025-11-22 08:53:03.895 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:07 np0005531887 nova_compute[186849]: 2025-11-22 08:53:07.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:07 np0005531887 podman[249305]: 2025-11-22 08:53:07.840324331 +0000 UTC m=+0.055471003 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:53:08 np0005531887 nova_compute[186849]: 2025-11-22 08:53:08.896 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:10 np0005531887 podman[249326]: 2025-11-22 08:53:10.83766376 +0000 UTC m=+0.061989074 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:53:12 np0005531887 nova_compute[186849]: 2025-11-22 08:53:12.359 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:12.867 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:53:12 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:12.867 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:53:12 np0005531887 nova_compute[186849]: 2025-11-22 08:53:12.868 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:13 np0005531887 nova_compute[186849]: 2025-11-22 08:53:13.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:14 np0005531887 nova_compute[186849]: 2025-11-22 08:53:14.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:17 np0005531887 nova_compute[186849]: 2025-11-22 08:53:17.362 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:17 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:17.869 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:53:18 np0005531887 podman[249352]: 2025-11-22 08:53:18.849008465 +0000 UTC m=+0.063850898 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:53:18 np0005531887 nova_compute[186849]: 2025-11-22 08:53:18.900 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:22 np0005531887 nova_compute[186849]: 2025-11-22 08:53:22.364 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:22 np0005531887 podman[249373]: 2025-11-22 08:53:22.834062046 +0000 UTC m=+0.052779987 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:53:22 np0005531887 podman[249374]: 2025-11-22 08:53:22.901011119 +0000 UTC m=+0.115013025 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 03:53:23 np0005531887 nova_compute[186849]: 2025-11-22 08:53:23.901 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:26 np0005531887 podman[249420]: 2025-11-22 08:53:26.828686823 +0000 UTC m=+0.052815558 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:53:27 np0005531887 nova_compute[186849]: 2025-11-22 08:53:27.365 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:28 np0005531887 nova_compute[186849]: 2025-11-22 08:53:28.904 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:32 np0005531887 nova_compute[186849]: 2025-11-22 08:53:32.368 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:33 np0005531887 nova_compute[186849]: 2025-11-22 08:53:33.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:33 np0005531887 nova_compute[186849]: 2025-11-22 08:53:33.914 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:34 np0005531887 podman[249444]: 2025-11-22 08:53:34.834201622 +0000 UTC m=+0.057135004 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:53:37 np0005531887 nova_compute[186849]: 2025-11-22 08:53:37.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:37.389 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:37.390 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:53:37.391 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:53:38 np0005531887 podman[249464]: 2025-11-22 08:53:38.865655745 +0000 UTC m=+0.086740491 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:53:38 np0005531887 nova_compute[186849]: 2025-11-22 08:53:38.915 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:40 np0005531887 nova_compute[186849]: 2025-11-22 08:53:40.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:40 np0005531887 nova_compute[186849]: 2025-11-22 08:53:40.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:53:41 np0005531887 podman[249486]: 2025-11-22 08:53:41.831028679 +0000 UTC m=+0.052481520 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:53:42 np0005531887 nova_compute[186849]: 2025-11-22 08:53:42.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:42 np0005531887 nova_compute[186849]: 2025-11-22 08:53:42.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:42 np0005531887 nova_compute[186849]: 2025-11-22 08:53:42.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:53:42 np0005531887 nova_compute[186849]: 2025-11-22 08:53:42.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:53:42 np0005531887 nova_compute[186849]: 2025-11-22 08:53:42.795 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:53:43 np0005531887 nova_compute[186849]: 2025-11-22 08:53:43.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:43 np0005531887 nova_compute[186849]: 2025-11-22 08:53:43.918 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:53:45 np0005531887 nova_compute[186849]: 2025-11-22 08:53:45.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:53:45 np0005531887 nova_compute[186849]: 2025-11-22 08:53:45.816 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:45 np0005531887 nova_compute[186849]: 2025-11-22 08:53:45.817 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:45 np0005531887 nova_compute[186849]: 2025-11-22 08:53:45.817 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:53:45 np0005531887 nova_compute[186849]: 2025-11-22 08:53:45.817 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.030 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.031 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.26626205444336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.032 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.033 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.233 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.233 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.279 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.296 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.299 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 03:53:46 np0005531887 nova_compute[186849]: 2025-11-22 08:53:46.300 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:53:47 np0005531887 nova_compute[186849]: 2025-11-22 08:53:47.295 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:53:47 np0005531887 nova_compute[186849]: 2025-11-22 08:53:47.375 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:53:47 np0005531887 nova_compute[186849]: 2025-11-22 08:53:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:53:48 np0005531887 nova_compute[186849]: 2025-11-22 08:53:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:53:48 np0005531887 nova_compute[186849]: 2025-11-22 08:53:48.921 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:53:49 np0005531887 podman[249510]: 2025-11-22 08:53:49.840573359 +0000 UTC m=+0.061045070 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 22 03:53:52 np0005531887 nova_compute[186849]: 2025-11-22 08:53:52.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:53:53 np0005531887 podman[249532]: 2025-11-22 08:53:53.845199011 +0000 UTC m=+0.060732082 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 03:53:53 np0005531887 podman[249533]: 2025-11-22 08:53:53.878632022 +0000 UTC m=+0.095149157 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 03:53:53 np0005531887 nova_compute[186849]: 2025-11-22 08:53:53.922 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:53:57 np0005531887 nova_compute[186849]: 2025-11-22 08:53:57.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:53:57 np0005531887 podman[249576]: 2025-11-22 08:53:57.4649527 +0000 UTC m=+0.054619022 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:53:58 np0005531887 nova_compute[186849]: 2025-11-22 08:53:58.924 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:02 np0005531887 nova_compute[186849]: 2025-11-22 08:54:02.381 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:03 np0005531887 nova_compute[186849]: 2025-11-22 08:54:03.926 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:04 np0005531887 nova_compute[186849]: 2025-11-22 08:54:04.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:54:05 np0005531887 podman[249600]: 2025-11-22 08:54:05.831143141 +0000 UTC m=+0.054566281 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 22 03:54:07 np0005531887 nova_compute[186849]: 2025-11-22 08:54:07.383 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:08 np0005531887 nova_compute[186849]: 2025-11-22 08:54:08.928 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:09 np0005531887 podman[249619]: 2025-11-22 08:54:09.849420648 +0000 UTC m=+0.067690464 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118)
Nov 22 03:54:12 np0005531887 nova_compute[186849]: 2025-11-22 08:54:12.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:12 np0005531887 podman[249639]: 2025-11-22 08:54:12.849604305 +0000 UTC m=+0.059479701 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:54:13 np0005531887 nova_compute[186849]: 2025-11-22 08:54:13.930 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:17 np0005531887 nova_compute[186849]: 2025-11-22 08:54:17.389 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:18 np0005531887 nova_compute[186849]: 2025-11-22 08:54:18.932 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:20 np0005531887 podman[249667]: 2025-11-22 08:54:20.842675282 +0000 UTC m=+0.061900591 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 22 03:54:22 np0005531887 nova_compute[186849]: 2025-11-22 08:54:22.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:23 np0005531887 nova_compute[186849]: 2025-11-22 08:54:23.934 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:24 np0005531887 podman[249688]: 2025-11-22 08:54:24.853471165 +0000 UTC m=+0.065248063 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 22 03:54:24 np0005531887 podman[249689]: 2025-11-22 08:54:24.903700039 +0000 UTC m=+0.108961787 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:54:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:25.425 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:54:25 np0005531887 nova_compute[186849]: 2025-11-22 08:54:25.427 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:25 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:25.428 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:54:27 np0005531887 nova_compute[186849]: 2025-11-22 08:54:27.392 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:27 np0005531887 podman[249734]: 2025-11-22 08:54:27.840080267 +0000 UTC m=+0.055675738 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:54:28 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:28.431 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:54:28 np0005531887 nova_compute[186849]: 2025-11-22 08:54:28.937 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:32 np0005531887 nova_compute[186849]: 2025-11-22 08:54:32.394 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:33 np0005531887 nova_compute[186849]: 2025-11-22 08:54:33.938 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:35 np0005531887 nova_compute[186849]: 2025-11-22 08:54:35.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:54:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:54:36 np0005531887 podman[249758]: 2025-11-22 08:54:36.866168241 +0000 UTC m=+0.083122642 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 03:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:37.390 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:37.391 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:54:37.391 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:37 np0005531887 nova_compute[186849]: 2025-11-22 08:54:37.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:38 np0005531887 nova_compute[186849]: 2025-11-22 08:54:38.941 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:40 np0005531887 nova_compute[186849]: 2025-11-22 08:54:40.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:40 np0005531887 nova_compute[186849]: 2025-11-22 08:54:40.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:54:40 np0005531887 podman[249778]: 2025-11-22 08:54:40.841612517 +0000 UTC m=+0.065061309 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:54:42 np0005531887 nova_compute[186849]: 2025-11-22 08:54:42.400 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:43 np0005531887 nova_compute[186849]: 2025-11-22 08:54:43.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:43 np0005531887 nova_compute[186849]: 2025-11-22 08:54:43.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:54:43 np0005531887 nova_compute[186849]: 2025-11-22 08:54:43.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:54:43 np0005531887 nova_compute[186849]: 2025-11-22 08:54:43.781 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:54:43 np0005531887 podman[249798]: 2025-11-22 08:54:43.836441813 +0000 UTC m=+0.055522494 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:54:43 np0005531887 nova_compute[186849]: 2025-11-22 08:54:43.943 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:44 np0005531887 nova_compute[186849]: 2025-11-22 08:54:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.792 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.955 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.957 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5751MB free_disk=73.26628112792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.957 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:45 np0005531887 nova_compute[186849]: 2025-11-22 08:54:45.957 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.027 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.028 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.051 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.071 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.073 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:54:46 np0005531887 nova_compute[186849]: 2025-11-22 08:54:46.073 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:47 np0005531887 nova_compute[186849]: 2025-11-22 08:54:47.067 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:47 np0005531887 nova_compute[186849]: 2025-11-22 08:54:47.401 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:48 np0005531887 nova_compute[186849]: 2025-11-22 08:54:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:48 np0005531887 nova_compute[186849]: 2025-11-22 08:54:48.946 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:49 np0005531887 nova_compute[186849]: 2025-11-22 08:54:49.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:51 np0005531887 podman[249824]: 2025-11-22 08:54:51.842548529 +0000 UTC m=+0.058815366 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, 
io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:54:52 np0005531887 nova_compute[186849]: 2025-11-22 08:54:52.402 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:53 np0005531887 nova_compute[186849]: 2025-11-22 08:54:53.949 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.769 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.770 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.770 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.771 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.771 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.771 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.791 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.791 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.791 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.792 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.792 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.792 186853 WARNING nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73#033[00m
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.792 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Removable base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29 /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2 /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.793 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.793 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.793 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5e1b055cd2dda7073fea6bdd458a9a8fcf51be29
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.793 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/253915408629bc954cd98505927b671e71b5d9d2
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.794 186853 INFO nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/dd5495c5d01c62e3f2430ded9b741807f5260e73
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.794 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.794 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 22 03:54:54 np0005531887 nova_compute[186849]: 2025-11-22 08:54:54.794 186853 DEBUG nova.virt.libvirt.imagecache [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 22 03:54:55 np0005531887 podman[249848]: 2025-11-22 08:54:55.85609127 +0000 UTC m=+0.069712113 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:54:55 np0005531887 podman[249849]: 2025-11-22 08:54:55.89110434 +0000 UTC m=+0.098174422 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:54:57 np0005531887 nova_compute[186849]: 2025-11-22 08:54:57.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:54:58 np0005531887 podman[249890]: 2025-11-22 08:54:58.856617796 +0000 UTC m=+0.080636600 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:54:58 np0005531887 nova_compute[186849]: 2025-11-22 08:54:58.949 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:02 np0005531887 nova_compute[186849]: 2025-11-22 08:55:02.404 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:03 np0005531887 nova_compute[186849]: 2025-11-22 08:55:03.950 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:06 np0005531887 nova_compute[186849]: 2025-11-22 08:55:06.794 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:07 np0005531887 nova_compute[186849]: 2025-11-22 08:55:07.406 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:07 np0005531887 podman[249914]: 2025-11-22 08:55:07.850429185 +0000 UTC m=+0.061384639 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 03:55:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:08.803 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 03:55:08 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:08.804 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 03:55:08 np0005531887 nova_compute[186849]: 2025-11-22 08:55:08.803 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:08 np0005531887 nova_compute[186849]: 2025-11-22 08:55:08.952 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:11 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:11.806 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:55:11 np0005531887 podman[249933]: 2025-11-22 08:55:11.850501065 +0000 UTC m=+0.071948398 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:55:12 np0005531887 nova_compute[186849]: 2025-11-22 08:55:12.408 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:13 np0005531887 nova_compute[186849]: 2025-11-22 08:55:13.953 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:14 np0005531887 nova_compute[186849]: 2025-11-22 08:55:14.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:14 np0005531887 nova_compute[186849]: 2025-11-22 08:55:14.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 03:55:14 np0005531887 nova_compute[186849]: 2025-11-22 08:55:14.789 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 03:55:14 np0005531887 podman[249955]: 2025-11-22 08:55:14.869461074 +0000 UTC m=+0.085972643 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:55:17 np0005531887 nova_compute[186849]: 2025-11-22 08:55:17.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:17 np0005531887 nova_compute[186849]: 2025-11-22 08:55:17.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:18 np0005531887 nova_compute[186849]: 2025-11-22 08:55:18.955 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:21 np0005531887 nova_compute[186849]: 2025-11-22 08:55:21.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:21 np0005531887 nova_compute[186849]: 2025-11-22 08:55:21.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 03:55:22 np0005531887 nova_compute[186849]: 2025-11-22 08:55:22.417 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:22 np0005531887 podman[249979]: 2025-11-22 08:55:22.83444773 +0000 UTC m=+0.056294284 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64)
Nov 22 03:55:23 np0005531887 nova_compute[186849]: 2025-11-22 08:55:23.960 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:26 np0005531887 podman[250000]: 2025-11-22 08:55:26.863970874 +0000 UTC m=+0.065608301 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 03:55:26 np0005531887 podman[250001]: 2025-11-22 08:55:26.89598917 +0000 UTC m=+0.091521388 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:55:27 np0005531887 nova_compute[186849]: 2025-11-22 08:55:27.420 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:28 np0005531887 nova_compute[186849]: 2025-11-22 08:55:28.962 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:29 np0005531887 podman[250047]: 2025-11-22 08:55:29.839145027 +0000 UTC m=+0.053114755 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:55:32 np0005531887 nova_compute[186849]: 2025-11-22 08:55:32.423 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:33 np0005531887 nova_compute[186849]: 2025-11-22 08:55:33.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:33 np0005531887 nova_compute[186849]: 2025-11-22 08:55:33.966 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:37.391 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:37.392 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:55:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:55:37.392 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:55:37 np0005531887 nova_compute[186849]: 2025-11-22 08:55:37.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:37 np0005531887 nova_compute[186849]: 2025-11-22 08:55:37.784 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:38 np0005531887 podman[250071]: 2025-11-22 08:55:38.841067556 +0000 UTC m=+0.059544834 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 03:55:38 np0005531887 nova_compute[186849]: 2025-11-22 08:55:38.968 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:42 np0005531887 nova_compute[186849]: 2025-11-22 08:55:42.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:55:42 np0005531887 nova_compute[186849]: 2025-11-22 08:55:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:55:42 np0005531887 nova_compute[186849]: 2025-11-22 08:55:42.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 03:55:42 np0005531887 podman[250091]: 2025-11-22 08:55:42.843182367 +0000 UTC m=+0.061427830 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:55:43 np0005531887 nova_compute[186849]: 2025-11-22 08:55:43.969 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:44 np0005531887 nova_compute[186849]: 2025-11-22 08:55:44.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:44 np0005531887 nova_compute[186849]: 2025-11-22 08:55:44.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:55:44 np0005531887 nova_compute[186849]: 2025-11-22 08:55:44.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:55:44 np0005531887 nova_compute[186849]: 2025-11-22 08:55:44.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:55:45 np0005531887 podman[250111]: 2025-11-22 08:55:45.830177561 +0000 UTC m=+0.051509825 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:55:46 np0005531887 nova_compute[186849]: 2025-11-22 08:55:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.430 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.792 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.793 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.793 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.995 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.997 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5747MB free_disk=73.26626205444336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:55:47 np0005531887 nova_compute[186849]: 2025-11-22 08:55:47.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.152 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.152 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.155 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.604 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.627 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.629 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:55:49 np0005531887 nova_compute[186849]: 2025-11-22 08:55:49.629 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:55:50 np0005531887 nova_compute[186849]: 2025-11-22 08:55:50.624 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:50 np0005531887 nova_compute[186849]: 2025-11-22 08:55:50.624 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:51 np0005531887 nova_compute[186849]: 2025-11-22 08:55:51.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:55:52 np0005531887 nova_compute[186849]: 2025-11-22 08:55:52.433 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:53 np0005531887 podman[250135]: 2025-11-22 08:55:53.838199734 +0000 UTC m=+0.060925328 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 03:55:53 np0005531887 nova_compute[186849]: 2025-11-22 08:55:53.973 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:57 np0005531887 nova_compute[186849]: 2025-11-22 08:55:57.434 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:55:57 np0005531887 podman[250156]: 2025-11-22 08:55:57.837670139 +0000 UTC m=+0.063036188 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 03:55:57 np0005531887 podman[250157]: 2025-11-22 08:55:57.902082421 +0000 UTC m=+0.119607997 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:55:58 np0005531887 nova_compute[186849]: 2025-11-22 08:55:58.974 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:00 np0005531887 podman[250203]: 2025-11-22 08:56:00.835480789 +0000 UTC m=+0.053360961 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:56:02 np0005531887 nova_compute[186849]: 2025-11-22 08:56:02.435 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:03 np0005531887 nova_compute[186849]: 2025-11-22 08:56:03.977 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:06 np0005531887 nova_compute[186849]: 2025-11-22 08:56:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:07 np0005531887 nova_compute[186849]: 2025-11-22 08:56:07.436 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:08 np0005531887 nova_compute[186849]: 2025-11-22 08:56:08.979 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:09 np0005531887 podman[250228]: 2025-11-22 08:56:09.859444508 +0000 UTC m=+0.081840410 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:56:11 np0005531887 nova_compute[186849]: 2025-11-22 08:56:11.655 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:12 np0005531887 nova_compute[186849]: 2025-11-22 08:56:12.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:13 np0005531887 podman[250249]: 2025-11-22 08:56:13.854402294 +0000 UTC m=+0.066340531 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 03:56:13 np0005531887 nova_compute[186849]: 2025-11-22 08:56:13.980 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:15.075 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:56:15 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:15.076 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:56:15 np0005531887 nova_compute[186849]: 2025-11-22 08:56:15.077 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:16 np0005531887 podman[250269]: 2025-11-22 08:56:16.83540558 +0000 UTC m=+0.055111415 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 03:56:17 np0005531887 nova_compute[186849]: 2025-11-22 08:56:17.439 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:18 np0005531887 nova_compute[186849]: 2025-11-22 08:56:18.982 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:19 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:19.079 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 03:56:22 np0005531887 nova_compute[186849]: 2025-11-22 08:56:22.441 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:23 np0005531887 nova_compute[186849]: 2025-11-22 08:56:23.983 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:24 np0005531887 podman[250294]: 2025-11-22 08:56:24.838371589 +0000 UTC m=+0.057927574 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 03:56:27 np0005531887 nova_compute[186849]: 2025-11-22 08:56:27.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:28 np0005531887 podman[250315]: 2025-11-22 08:56:28.860162111 +0000 UTC m=+0.072681766 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:56:28 np0005531887 podman[250316]: 2025-11-22 08:56:28.877414394 +0000 UTC m=+0.087644432 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 03:56:28 np0005531887 nova_compute[186849]: 2025-11-22 08:56:28.984 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:31 np0005531887 podman[250362]: 2025-11-22 08:56:31.842445461 +0000 UTC m=+0.053264689 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 03:56:32 np0005531887 nova_compute[186849]: 2025-11-22 08:56:32.443 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:33 np0005531887 nova_compute[186849]: 2025-11-22 08:56:33.986 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:56:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:37.392 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:37.392 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:56:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:56:37.392 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:56:37 np0005531887 nova_compute[186849]: 2025-11-22 08:56:37.445 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:38 np0005531887 nova_compute[186849]: 2025-11-22 08:56:38.797 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:56:38 np0005531887 nova_compute[186849]: 2025-11-22 08:56:38.988 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:40 np0005531887 podman[250386]: 2025-11-22 08:56:40.842157146 +0000 UTC m=+0.058150230 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 03:56:42 np0005531887 nova_compute[186849]: 2025-11-22 08:56:42.446 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:42 np0005531887 nova_compute[186849]: 2025-11-22 08:56:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:56:42 np0005531887 nova_compute[186849]: 2025-11-22 08:56:42.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 03:56:43 np0005531887 nova_compute[186849]: 2025-11-22 08:56:43.990 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:44 np0005531887 nova_compute[186849]: 2025-11-22 08:56:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:56:44 np0005531887 nova_compute[186849]: 2025-11-22 08:56:44.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 03:56:44 np0005531887 nova_compute[186849]: 2025-11-22 08:56:44.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 03:56:44 np0005531887 nova_compute[186849]: 2025-11-22 08:56:44.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 03:56:44 np0005531887 podman[250403]: 2025-11-22 08:56:44.846539211 +0000 UTC m=+0.062051665 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:56:47 np0005531887 nova_compute[186849]: 2025-11-22 08:56:47.447 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:47 np0005531887 podman[250423]: 2025-11-22 08:56:47.855587417 +0000 UTC m=+0.075059235 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.808 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.973 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.974 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=73.26619338989258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.974 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.975 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 03:56:48 np0005531887 nova_compute[186849]: 2025-11-22 08:56:48.993 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.070 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.070 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.091 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.192 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.193 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.210 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.249 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.278 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.302 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.305 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:56:49 np0005531887 nova_compute[186849]: 2025-11-22 08:56:49.305 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:56:51 np0005531887 nova_compute[186849]: 2025-11-22 08:56:51.300 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:51 np0005531887 nova_compute[186849]: 2025-11-22 08:56:51.301 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:51 np0005531887 nova_compute[186849]: 2025-11-22 08:56:51.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:56:52 np0005531887 nova_compute[186849]: 2025-11-22 08:56:52.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:53 np0005531887 nova_compute[186849]: 2025-11-22 08:56:53.994 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:55 np0005531887 podman[250447]: 2025-11-22 08:56:55.860755929 +0000 UTC m=+0.068442192 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 03:56:57 np0005531887 nova_compute[186849]: 2025-11-22 08:56:57.450 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:58 np0005531887 nova_compute[186849]: 2025-11-22 08:56:58.996 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:56:59 np0005531887 podman[250469]: 2025-11-22 08:56:59.869945143 +0000 UTC m=+0.087386677 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 03:56:59 np0005531887 podman[250470]: 2025-11-22 08:56:59.887382601 +0000 UTC m=+0.099138366 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 03:57:02 np0005531887 nova_compute[186849]: 2025-11-22 08:57:02.452 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:02 np0005531887 podman[250517]: 2025-11-22 08:57:02.84153442 +0000 UTC m=+0.056704974 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:57:03 np0005531887 nova_compute[186849]: 2025-11-22 08:57:03.997 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:06 np0005531887 nova_compute[186849]: 2025-11-22 08:57:06.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:07 np0005531887 nova_compute[186849]: 2025-11-22 08:57:07.452 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:09 np0005531887 nova_compute[186849]: 2025-11-22 08:57:09.000 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:11 np0005531887 podman[250542]: 2025-11-22 08:57:11.839330706 +0000 UTC m=+0.058521038 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 03:57:12 np0005531887 nova_compute[186849]: 2025-11-22 08:57:12.454 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:14 np0005531887 nova_compute[186849]: 2025-11-22 08:57:14.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:15 np0005531887 podman[250563]: 2025-11-22 08:57:15.832789284 +0000 UTC m=+0.054896010 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:57:17 np0005531887 nova_compute[186849]: 2025-11-22 08:57:17.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:18 np0005531887 podman[250584]: 2025-11-22 08:57:18.853618529 +0000 UTC m=+0.065419738 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:57:19 np0005531887 nova_compute[186849]: 2025-11-22 08:57:19.004 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:19 np0005531887 nova_compute[186849]: 2025-11-22 08:57:19.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:22 np0005531887 nova_compute[186849]: 2025-11-22 08:57:22.456 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:24 np0005531887 nova_compute[186849]: 2025-11-22 08:57:24.006 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:26 np0005531887 podman[250609]: 2025-11-22 08:57:26.854896264 +0000 UTC m=+0.069010846 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 03:57:27 np0005531887 nova_compute[186849]: 2025-11-22 08:57:27.458 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:29 np0005531887 nova_compute[186849]: 2025-11-22 08:57:29.007 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:30.218 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:57:30 np0005531887 nova_compute[186849]: 2025-11-22 08:57:30.218 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:30 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:30.220 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:57:30 np0005531887 podman[250633]: 2025-11-22 08:57:30.853376887 +0000 UTC m=+0.067621281 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 03:57:30 np0005531887 podman[250634]: 2025-11-22 08:57:30.872654041 +0000 UTC m=+0.090176425 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:57:32 np0005531887 nova_compute[186849]: 2025-11-22 08:57:32.460 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:33 np0005531887 podman[250680]: 2025-11-22 08:57:33.857449011 +0000 UTC m=+0.081046761 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:57:34 np0005531887 nova_compute[186849]: 2025-11-22 08:57:34.008 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:35 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:35.223 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:37.393 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:37.394 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:57:37.394 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:37 np0005531887 nova_compute[186849]: 2025-11-22 08:57:37.462 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:39 np0005531887 nova_compute[186849]: 2025-11-22 08:57:39.010 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:40 np0005531887 nova_compute[186849]: 2025-11-22 08:57:40.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:42 np0005531887 nova_compute[186849]: 2025-11-22 08:57:42.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:42 np0005531887 podman[250704]: 2025-11-22 08:57:42.839719607 +0000 UTC m=+0.046900682 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:57:43 np0005531887 nova_compute[186849]: 2025-11-22 08:57:43.774 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:43 np0005531887 nova_compute[186849]: 2025-11-22 08:57:43.776 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:57:44 np0005531887 nova_compute[186849]: 2025-11-22 08:57:44.012 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:45 np0005531887 nova_compute[186849]: 2025-11-22 08:57:45.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:45 np0005531887 nova_compute[186849]: 2025-11-22 08:57:45.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:57:45 np0005531887 nova_compute[186849]: 2025-11-22 08:57:45.772 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:57:45 np0005531887 nova_compute[186849]: 2025-11-22 08:57:45.790 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:57:46 np0005531887 podman[250722]: 2025-11-22 08:57:46.840554926 +0000 UTC m=+0.062240360 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:57:47 np0005531887 nova_compute[186849]: 2025-11-22 08:57:47.466 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:48.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:48.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:48.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:48.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.005 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.006 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=73.26617431640625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.007 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.007 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.014 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.295 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.296 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.332 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.360 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.362 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:57:49 np0005531887 nova_compute[186849]: 2025-11-22 08:57:49.362 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:57:49 np0005531887 podman[250744]: 2025-11-22 08:57:49.847153211 +0000 UTC m=+0.060941157 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:57:51 np0005531887 nova_compute[186849]: 2025-11-22 08:57:51.356 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:51 np0005531887 nova_compute[186849]: 2025-11-22 08:57:51.356 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:51 np0005531887 nova_compute[186849]: 2025-11-22 08:57:51.356 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:51 np0005531887 nova_compute[186849]: 2025-11-22 08:57:51.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:57:52 np0005531887 nova_compute[186849]: 2025-11-22 08:57:52.468 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:54 np0005531887 nova_compute[186849]: 2025-11-22 08:57:54.016 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:57 np0005531887 nova_compute[186849]: 2025-11-22 08:57:57.469 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:57:57 np0005531887 podman[250768]: 2025-11-22 08:57:57.866842871 +0000 UTC m=+0.088993996 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7)
Nov 22 03:57:59 np0005531887 nova_compute[186849]: 2025-11-22 08:57:59.017 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:01 np0005531887 podman[250789]: 2025-11-22 08:58:01.839032506 +0000 UTC m=+0.061562642 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 03:58:01 np0005531887 podman[250790]: 2025-11-22 08:58:01.883703153 +0000 UTC m=+0.101783760 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 03:58:02 np0005531887 nova_compute[186849]: 2025-11-22 08:58:02.471 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:04 np0005531887 nova_compute[186849]: 2025-11-22 08:58:04.018 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:04 np0005531887 podman[250834]: 2025-11-22 08:58:04.866398882 +0000 UTC m=+0.078545020 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:58:07 np0005531887 nova_compute[186849]: 2025-11-22 08:58:07.473 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:07 np0005531887 nova_compute[186849]: 2025-11-22 08:58:07.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:09 np0005531887 nova_compute[186849]: 2025-11-22 08:58:09.020 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:12 np0005531887 nova_compute[186849]: 2025-11-22 08:58:12.475 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:13 np0005531887 podman[250860]: 2025-11-22 08:58:13.872669715 +0000 UTC m=+0.089590361 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 03:58:14 np0005531887 nova_compute[186849]: 2025-11-22 08:58:14.027 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:17 np0005531887 nova_compute[186849]: 2025-11-22 08:58:17.477 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:17 np0005531887 podman[250880]: 2025-11-22 08:58:17.841313974 +0000 UTC m=+0.061269176 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 03:58:19 np0005531887 nova_compute[186849]: 2025-11-22 08:58:19.030 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:20 np0005531887 podman[250900]: 2025-11-22 08:58:20.85850945 +0000 UTC m=+0.069834326 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:58:22 np0005531887 nova_compute[186849]: 2025-11-22 08:58:22.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:24 np0005531887 nova_compute[186849]: 2025-11-22 08:58:24.033 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:27 np0005531887 nova_compute[186849]: 2025-11-22 08:58:27.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:28 np0005531887 podman[250924]: 2025-11-22 08:58:28.852081229 +0000 UTC m=+0.069129499 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 03:58:29 np0005531887 nova_compute[186849]: 2025-11-22 08:58:29.034 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531887 nova_compute[186849]: 2025-11-22 08:58:32.483 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531887 podman[250945]: 2025-11-22 08:58:32.85383582 +0000 UTC m=+0.069599220 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 22 03:58:32 np0005531887 podman[250946]: 2025-11-22 08:58:32.919696797 +0000 UTC m=+0.120219593 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 22 03:58:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:32.925 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 03:58:32 np0005531887 nova_compute[186849]: 2025-11-22 08:58:32.925 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:32 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:32.927 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 03:58:34 np0005531887 nova_compute[186849]: 2025-11-22 08:58:34.037 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:34 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:34.929 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 03:58:35 np0005531887 podman[250990]: 2025-11-22 08:58:35.886911713 +0000 UTC m=+0.101261187 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.679 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 08:58:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 03:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:37.394 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:37.395 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:58:37.395 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:37 np0005531887 nova_compute[186849]: 2025-11-22 08:58:37.485 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:39 np0005531887 nova_compute[186849]: 2025-11-22 08:58:39.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:42 np0005531887 nova_compute[186849]: 2025-11-22 08:58:42.488 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:42 np0005531887 nova_compute[186849]: 2025-11-22 08:58:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:44 np0005531887 nova_compute[186849]: 2025-11-22 08:58:44.041 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:44 np0005531887 nova_compute[186849]: 2025-11-22 08:58:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:44 np0005531887 nova_compute[186849]: 2025-11-22 08:58:44.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:58:44 np0005531887 podman[251014]: 2025-11-22 08:58:44.868418231 +0000 UTC m=+0.077696869 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base 
Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 03:58:46 np0005531887 nova_compute[186849]: 2025-11-22 08:58:46.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:46 np0005531887 nova_compute[186849]: 2025-11-22 08:58:46.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:58:46 np0005531887 nova_compute[186849]: 2025-11-22 08:58:46.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:58:46 np0005531887 nova_compute[186849]: 2025-11-22 08:58:46.789 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:58:47 np0005531887 nova_compute[186849]: 2025-11-22 08:58:47.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:58:48 np0005531887 podman[251033]: 2025-11-22 08:58:48.901597096 +0000 UTC m=+0.113102059 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.995 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5748MB free_disk=73.26619338989258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:58:48 np0005531887 nova_compute[186849]: 2025-11-22 08:58:48.997 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.044 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.068 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.069 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.105 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.121 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.122 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:58:49 np0005531887 nova_compute[186849]: 2025-11-22 08:58:49.123 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:58:51 np0005531887 nova_compute[186849]: 2025-11-22 08:58:51.123 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:51 np0005531887 nova_compute[186849]: 2025-11-22 08:58:51.123 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:51 np0005531887 nova_compute[186849]: 2025-11-22 08:58:51.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:51 np0005531887 podman[251053]: 2025-11-22 08:58:51.83610327 +0000 UTC m=+0.055112354 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:58:52 np0005531887 nova_compute[186849]: 2025-11-22 08:58:52.492 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:52 np0005531887 nova_compute[186849]: 2025-11-22 08:58:52.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:58:54 np0005531887 nova_compute[186849]: 2025-11-22 08:58:54.047 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:57 np0005531887 nova_compute[186849]: 2025-11-22 08:58:57.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:59 np0005531887 nova_compute[186849]: 2025-11-22 08:58:59.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:58:59 np0005531887 podman[251077]: 2025-11-22 08:58:59.901065773 +0000 UTC m=+0.108824104 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc.)
Nov 22 03:59:02 np0005531887 nova_compute[186849]: 2025-11-22 08:59:02.495 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:03 np0005531887 podman[251098]: 2025-11-22 08:59:03.884460082 +0000 UTC m=+0.101600566 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Nov 22 03:59:03 np0005531887 podman[251099]: 2025-11-22 08:59:03.888363758 +0000 UTC m=+0.102624551 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 03:59:04 np0005531887 nova_compute[186849]: 2025-11-22 08:59:04.053 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:06 np0005531887 podman[251144]: 2025-11-22 08:59:06.83700664 +0000 UTC m=+0.058045196 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 03:59:07 np0005531887 nova_compute[186849]: 2025-11-22 08:59:07.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:08 np0005531887 nova_compute[186849]: 2025-11-22 08:59:08.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:09 np0005531887 nova_compute[186849]: 2025-11-22 08:59:09.056 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:12 np0005531887 nova_compute[186849]: 2025-11-22 08:59:12.499 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:14 np0005531887 nova_compute[186849]: 2025-11-22 08:59:14.058 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:15 np0005531887 podman[251168]: 2025-11-22 08:59:15.845299304 +0000 UTC m=+0.062262029 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 03:59:17 np0005531887 nova_compute[186849]: 2025-11-22 08:59:17.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:19 np0005531887 nova_compute[186849]: 2025-11-22 08:59:19.060 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:19 np0005531887 podman[251188]: 2025-11-22 08:59:19.849589349 +0000 UTC m=+0.062499716 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 03:59:21 np0005531887 nova_compute[186849]: 2025-11-22 08:59:21.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:22 np0005531887 nova_compute[186849]: 2025-11-22 08:59:22.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:22 np0005531887 podman[251209]: 2025-11-22 08:59:22.852362591 +0000 UTC m=+0.077407632 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 03:59:24 np0005531887 nova_compute[186849]: 2025-11-22 08:59:24.065 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:27 np0005531887 nova_compute[186849]: 2025-11-22 08:59:27.503 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:29 np0005531887 nova_compute[186849]: 2025-11-22 08:59:29.065 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:30 np0005531887 systemd[1]: Starting dnf makecache...
Nov 22 03:59:30 np0005531887 podman[251233]: 2025-11-22 08:59:30.849562778 +0000 UTC m=+0.069441575 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64)
Nov 22 03:59:31 np0005531887 dnf[251234]: Metadata cache refreshed recently.
Nov 22 03:59:31 np0005531887 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 03:59:31 np0005531887 systemd[1]: Finished dnf makecache.
Nov 22 03:59:32 np0005531887 nova_compute[186849]: 2025-11-22 08:59:32.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:34 np0005531887 nova_compute[186849]: 2025-11-22 08:59:34.066 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:34 np0005531887 podman[251256]: 2025-11-22 08:59:34.89527171 +0000 UTC m=+0.108413914 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 03:59:34 np0005531887 podman[251257]: 2025-11-22 08:59:34.912494942 +0000 UTC m=+0.118485760 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 03:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:59:37.396 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:59:37.397 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:59:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 08:59:37.397 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:59:37 np0005531887 nova_compute[186849]: 2025-11-22 08:59:37.505 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:37 np0005531887 podman[251305]: 2025-11-22 08:59:37.864109608 +0000 UTC m=+0.082894627 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 03:59:39 np0005531887 nova_compute[186849]: 2025-11-22 08:59:39.069 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:42 np0005531887 nova_compute[186849]: 2025-11-22 08:59:42.507 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:42 np0005531887 nova_compute[186849]: 2025-11-22 08:59:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:44 np0005531887 nova_compute[186849]: 2025-11-22 08:59:44.070 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:44 np0005531887 nova_compute[186849]: 2025-11-22 08:59:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:44 np0005531887 nova_compute[186849]: 2025-11-22 08:59:44.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 03:59:46 np0005531887 podman[251329]: 2025-11-22 08:59:46.846786692 +0000 UTC m=+0.058169249 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 03:59:47 np0005531887 nova_compute[186849]: 2025-11-22 08:59:47.509 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:47 np0005531887 nova_compute[186849]: 2025-11-22 08:59:47.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:47 np0005531887 nova_compute[186849]: 2025-11-22 08:59:47.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 03:59:47 np0005531887 nova_compute[186849]: 2025-11-22 08:59:47.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 03:59:47 np0005531887 nova_compute[186849]: 2025-11-22 08:59:47.839 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 03:59:48 np0005531887 nova_compute[186849]: 2025-11-22 08:59:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:48 np0005531887 nova_compute[186849]: 2025-11-22 08:59:48.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:59:48 np0005531887 nova_compute[186849]: 2025-11-22 08:59:48.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:59:48 np0005531887 nova_compute[186849]: 2025-11-22 08:59:48.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:59:48 np0005531887 nova_compute[186849]: 2025-11-22 08:59:48.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.000 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.001 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5743MB free_disk=73.26619338989258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.002 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.002 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.054 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.055 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.073 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.080 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.100 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.102 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 03:59:49 np0005531887 nova_compute[186849]: 2025-11-22 08:59:49.103 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 03:59:50 np0005531887 podman[251350]: 2025-11-22 08:59:50.885043551 +0000 UTC m=+0.088841752 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 03:59:51 np0005531887 nova_compute[186849]: 2025-11-22 08:59:51.103 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:51 np0005531887 nova_compute[186849]: 2025-11-22 08:59:51.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:51 np0005531887 nova_compute[186849]: 2025-11-22 08:59:51.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:52 np0005531887 nova_compute[186849]: 2025-11-22 08:59:52.511 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:53 np0005531887 nova_compute[186849]: 2025-11-22 08:59:53.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 03:59:53 np0005531887 podman[251370]: 2025-11-22 08:59:53.877692693 +0000 UTC m=+0.081649825 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 03:59:54 np0005531887 nova_compute[186849]: 2025-11-22 08:59:54.075 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:57 np0005531887 nova_compute[186849]: 2025-11-22 08:59:57.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 03:59:59 np0005531887 nova_compute[186849]: 2025-11-22 08:59:59.077 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:01 np0005531887 podman[251394]: 2025-11-22 09:00:01.900752697 +0000 UTC m=+0.109256233 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:00:02 np0005531887 nova_compute[186849]: 2025-11-22 09:00:02.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:04 np0005531887 nova_compute[186849]: 2025-11-22 09:00:04.082 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:05 np0005531887 podman[251414]: 2025-11-22 09:00:05.840531996 +0000 UTC m=+0.063536921 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true)
Nov 22 04:00:05 np0005531887 podman[251415]: 2025-11-22 09:00:05.881541984 +0000 UTC m=+0.099822203 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 04:00:07 np0005531887 nova_compute[186849]: 2025-11-22 09:00:07.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:08 np0005531887 podman[251462]: 2025-11-22 09:00:08.869758967 +0000 UTC m=+0.088495114 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:00:09 np0005531887 nova_compute[186849]: 2025-11-22 09:00:09.084 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:10 np0005531887 nova_compute[186849]: 2025-11-22 09:00:10.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:12 np0005531887 nova_compute[186849]: 2025-11-22 09:00:12.519 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:14 np0005531887 nova_compute[186849]: 2025-11-22 09:00:14.087 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:17 np0005531887 nova_compute[186849]: 2025-11-22 09:00:17.520 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:17 np0005531887 podman[251486]: 2025-11-22 09:00:17.879308504 +0000 UTC m=+0.093237591 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 04:00:19 np0005531887 nova_compute[186849]: 2025-11-22 09:00:19.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:21 np0005531887 podman[251507]: 2025-11-22 09:00:21.837451234 +0000 UTC m=+0.058545279 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:00:22 np0005531887 systemd-logind[821]: New session 59 of user zuul.
Nov 22 04:00:22 np0005531887 nova_compute[186849]: 2025-11-22 09:00:22.522 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:22 np0005531887 systemd[1]: Started Session 59 of User zuul.
Nov 22 04:00:23 np0005531887 nova_compute[186849]: 2025-11-22 09:00:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:23 np0005531887 nova_compute[186849]: 2025-11-22 09:00:23.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:00:23 np0005531887 nova_compute[186849]: 2025-11-22 09:00:23.786 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:00:24 np0005531887 nova_compute[186849]: 2025-11-22 09:00:24.091 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:24 np0005531887 podman[251645]: 2025-11-22 09:00:24.856385823 +0000 UTC m=+0.070285447 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:00:27 np0005531887 nova_compute[186849]: 2025-11-22 09:00:27.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:27 np0005531887 ovs-vsctl[251730]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 04:00:28 np0005531887 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 251555 (sos)
Nov 22 04:00:28 np0005531887 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 22 04:00:28 np0005531887 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 22 04:00:28 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 04:00:28 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 04:00:28 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:00:29 np0005531887 nova_compute[186849]: 2025-11-22 09:00:29.092 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:32 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 04:00:32 np0005531887 nova_compute[186849]: 2025-11-22 09:00:32.525 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:32 np0005531887 podman[252269]: 2025-11-22 09:00:32.593974305 +0000 UTC m=+0.072086630 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 22 04:00:32 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 04:00:34 np0005531887 nova_compute[186849]: 2025-11-22 09:00:34.094 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:35 np0005531887 nova_compute[186849]: 2025-11-22 09:00:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:35 np0005531887 nova_compute[186849]: 2025-11-22 09:00:35.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:00:35 np0005531887 podman[252576]: 2025-11-22 09:00:35.966753502 +0000 UTC m=+0.069053056 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:00:36 np0005531887 podman[252597]: 2025-11-22 09:00:36.100135107 +0000 UTC m=+0.112489002 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:00:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:00:37.398 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:00:37.399 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:00:37.399 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:37 np0005531887 nova_compute[186849]: 2025-11-22 09:00:37.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:39 np0005531887 nova_compute[186849]: 2025-11-22 09:00:39.097 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:39 np0005531887 podman[253127]: 2025-11-22 09:00:39.853679194 +0000 UTC m=+0.059832540 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:00:41 np0005531887 ovs-appctl[253524]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:41 np0005531887 ovs-appctl[253528]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:41 np0005531887 ovs-appctl[253531]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 04:00:42 np0005531887 nova_compute[186849]: 2025-11-22 09:00:42.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:42 np0005531887 nova_compute[186849]: 2025-11-22 09:00:42.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:44 np0005531887 nova_compute[186849]: 2025-11-22 09:00:44.099 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:44 np0005531887 nova_compute[186849]: 2025-11-22 09:00:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:46 np0005531887 nova_compute[186849]: 2025-11-22 09:00:46.786 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:46 np0005531887 nova_compute[186849]: 2025-11-22 09:00:46.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:00:47 np0005531887 nova_compute[186849]: 2025-11-22 09:00:47.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:48 np0005531887 podman[254649]: 2025-11-22 09:00:48.00516192 +0000 UTC m=+0.054421787 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 04:00:48 np0005531887 nova_compute[186849]: 2025-11-22 09:00:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:48 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:00:48 np0005531887 nova_compute[186849]: 2025-11-22 09:00:48.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:48 np0005531887 nova_compute[186849]: 2025-11-22 09:00:48.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:48 np0005531887 nova_compute[186849]: 2025-11-22 09:00:48.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:48 np0005531887 nova_compute[186849]: 2025-11-22 09:00:48.800 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.000 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.002 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5526MB free_disk=72.6812858581543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.002 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.002 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.102 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.165 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.165 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.222 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.241 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.269 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:00:49 np0005531887 nova_compute[186849]: 2025-11-22 09:00:49.270 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:00:50 np0005531887 nova_compute[186849]: 2025-11-22 09:00:50.270 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:50 np0005531887 nova_compute[186849]: 2025-11-22 09:00:50.270 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:00:50 np0005531887 nova_compute[186849]: 2025-11-22 09:00:50.271 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:00:50 np0005531887 nova_compute[186849]: 2025-11-22 09:00:50.300 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:00:51 np0005531887 systemd[1]: Starting Time & Date Service...
Nov 22 04:00:51 np0005531887 systemd[1]: Started Time & Date Service.
Nov 22 04:00:51 np0005531887 nova_compute[186849]: 2025-11-22 09:00:51.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:51 np0005531887 nova_compute[186849]: 2025-11-22 09:00:51.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:51 np0005531887 nova_compute[186849]: 2025-11-22 09:00:51.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:52 np0005531887 nova_compute[186849]: 2025-11-22 09:00:52.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:52 np0005531887 podman[255090]: 2025-11-22 09:00:52.860225743 +0000 UTC m=+0.078889778 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 04:00:54 np0005531887 nova_compute[186849]: 2025-11-22 09:00:54.110 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:55 np0005531887 podman[255112]: 2025-11-22 09:00:55.655350737 +0000 UTC m=+0.068201726 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:00:55 np0005531887 nova_compute[186849]: 2025-11-22 09:00:55.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:00:57 np0005531887 nova_compute[186849]: 2025-11-22 09:00:57.538 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:00:59 np0005531887 nova_compute[186849]: 2025-11-22 09:00:59.114 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:02 np0005531887 nova_compute[186849]: 2025-11-22 09:01:02.540 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:02 np0005531887 podman[255146]: 2025-11-22 09:01:02.846230376 +0000 UTC m=+0.065079930 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 04:01:04 np0005531887 nova_compute[186849]: 2025-11-22 09:01:04.116 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:06 np0005531887 podman[255169]: 2025-11-22 09:01:06.880644129 +0000 UTC m=+0.078523949 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 04:01:06 np0005531887 podman[255170]: 2025-11-22 09:01:06.918428516 +0000 UTC m=+0.116197164 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 04:01:07 np0005531887 nova_compute[186849]: 2025-11-22 09:01:07.543 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:09 np0005531887 nova_compute[186849]: 2025-11-22 09:01:09.119 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:10 np0005531887 podman[255219]: 2025-11-22 09:01:10.811694774 +0000 UTC m=+0.065010038 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:01:12 np0005531887 nova_compute[186849]: 2025-11-22 09:01:12.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:12 np0005531887 nova_compute[186849]: 2025-11-22 09:01:12.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:14 np0005531887 nova_compute[186849]: 2025-11-22 09:01:14.123 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:17 np0005531887 nova_compute[186849]: 2025-11-22 09:01:17.545 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:19 np0005531887 nova_compute[186849]: 2025-11-22 09:01:19.125 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:19 np0005531887 podman[255243]: 2025-11-22 09:01:19.941574494 +0000 UTC m=+1.145369636 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:01:21 np0005531887 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 04:01:21 np0005531887 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 04:01:21 np0005531887 systemd[1]: session-59.scope: Deactivated successfully.
Nov 22 04:01:21 np0005531887 systemd[1]: session-59.scope: Consumed 1min 27.546s CPU time, 581.5M memory peak, read 100.9M from disk, written 17.4M to disk.
Nov 22 04:01:21 np0005531887 systemd-logind[821]: Session 59 logged out. Waiting for processes to exit.
Nov 22 04:01:21 np0005531887 systemd-logind[821]: Removed session 59.
Nov 22 04:01:21 np0005531887 systemd-logind[821]: New session 60 of user zuul.
Nov 22 04:01:21 np0005531887 systemd[1]: Started Session 60 of User zuul.
Nov 22 04:01:22 np0005531887 systemd[1]: session-60.scope: Deactivated successfully.
Nov 22 04:01:22 np0005531887 systemd-logind[821]: Session 60 logged out. Waiting for processes to exit.
Nov 22 04:01:22 np0005531887 systemd-logind[821]: Removed session 60.
Nov 22 04:01:22 np0005531887 systemd-logind[821]: New session 61 of user zuul.
Nov 22 04:01:22 np0005531887 systemd[1]: Started Session 61 of User zuul.
Nov 22 04:01:22 np0005531887 systemd[1]: session-61.scope: Deactivated successfully.
Nov 22 04:01:22 np0005531887 systemd-logind[821]: Session 61 logged out. Waiting for processes to exit.
Nov 22 04:01:22 np0005531887 systemd-logind[821]: Removed session 61.
Nov 22 04:01:22 np0005531887 nova_compute[186849]: 2025-11-22 09:01:22.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:23 np0005531887 podman[255326]: 2025-11-22 09:01:23.871877319 +0000 UTC m=+0.089295273 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:01:24 np0005531887 nova_compute[186849]: 2025-11-22 09:01:24.128 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:24 np0005531887 nova_compute[186849]: 2025-11-22 09:01:24.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:25 np0005531887 podman[255346]: 2025-11-22 09:01:25.835202249 +0000 UTC m=+0.055540065 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:01:27 np0005531887 nova_compute[186849]: 2025-11-22 09:01:27.549 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:29 np0005531887 nova_compute[186849]: 2025-11-22 09:01:29.131 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:32 np0005531887 nova_compute[186849]: 2025-11-22 09:01:32.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:33 np0005531887 podman[255370]: 2025-11-22 09:01:33.850317516 +0000 UTC m=+0.061599083 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 04:01:34 np0005531887 nova_compute[186849]: 2025-11-22 09:01:34.131 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:01:37.401 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:01:37.401 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:01:37.401 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:01:37 np0005531887 nova_compute[186849]: 2025-11-22 09:01:37.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:37 np0005531887 podman[255391]: 2025-11-22 09:01:37.860862104 +0000 UTC m=+0.074541472 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:01:37 np0005531887 podman[255392]: 2025-11-22 09:01:37.894641933 +0000 UTC m=+0.104510077 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:01:39 np0005531887 nova_compute[186849]: 2025-11-22 09:01:39.133 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:41 np0005531887 podman[255434]: 2025-11-22 09:01:41.879033206 +0000 UTC m=+0.098443298 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:01:42 np0005531887 nova_compute[186849]: 2025-11-22 09:01:42.553 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:42 np0005531887 nova_compute[186849]: 2025-11-22 09:01:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:44 np0005531887 nova_compute[186849]: 2025-11-22 09:01:44.134 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:47 np0005531887 nova_compute[186849]: 2025-11-22 09:01:47.555 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:48 np0005531887 nova_compute[186849]: 2025-11-22 09:01:48.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:48 np0005531887 nova_compute[186849]: 2025-11-22 09:01:48.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.135 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.827 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.827 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.828 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.828 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.977 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.978 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=73.2640609741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.978 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:01:49 np0005531887 nova_compute[186849]: 2025-11-22 09:01:49.978 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.083 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.084 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.392 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.660 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.660 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.677 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.720 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.759 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.779 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.839 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:01:50 np0005531887 nova_compute[186849]: 2025-11-22 09:01:50.840 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:01:50 np0005531887 podman[255459]: 2025-11-22 09:01:50.861473827 +0000 UTC m=+0.067537599 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.840 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.840 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.840 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.857 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.857 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:51 np0005531887 nova_compute[186849]: 2025-11-22 09:01:51.858 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:52 np0005531887 nova_compute[186849]: 2025-11-22 09:01:52.559 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:52 np0005531887 nova_compute[186849]: 2025-11-22 09:01:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:54 np0005531887 nova_compute[186849]: 2025-11-22 09:01:54.137 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:54 np0005531887 podman[255478]: 2025-11-22 09:01:54.883435145 +0000 UTC m=+0.090656046 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 22 04:01:55 np0005531887 nova_compute[186849]: 2025-11-22 09:01:55.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:01:56 np0005531887 podman[255499]: 2025-11-22 09:01:56.842618072 +0000 UTC m=+0.068638046 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:01:57 np0005531887 nova_compute[186849]: 2025-11-22 09:01:57.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:01:59 np0005531887 nova_compute[186849]: 2025-11-22 09:01:59.139 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:02 np0005531887 nova_compute[186849]: 2025-11-22 09:02:02.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:04 np0005531887 nova_compute[186849]: 2025-11-22 09:02:04.141 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:04 np0005531887 podman[255526]: 2025-11-22 09:02:04.855306038 +0000 UTC m=+0.075336400 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:02:07 np0005531887 nova_compute[186849]: 2025-11-22 09:02:07.570 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:08 np0005531887 podman[255547]: 2025-11-22 09:02:08.854229111 +0000 UTC m=+0.072629973 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 04:02:08 np0005531887 podman[255548]: 2025-11-22 09:02:08.891418875 +0000 UTC m=+0.106245820 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:02:09 np0005531887 nova_compute[186849]: 2025-11-22 09:02:09.143 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:12 np0005531887 nova_compute[186849]: 2025-11-22 09:02:12.574 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:12 np0005531887 nova_compute[186849]: 2025-11-22 09:02:12.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:12 np0005531887 podman[255594]: 2025-11-22 09:02:12.839573689 +0000 UTC m=+0.050877080 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:02:14 np0005531887 nova_compute[186849]: 2025-11-22 09:02:14.145 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:17 np0005531887 nova_compute[186849]: 2025-11-22 09:02:17.578 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:19 np0005531887 nova_compute[186849]: 2025-11-22 09:02:19.147 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:21 np0005531887 podman[255618]: 2025-11-22 09:02:21.853096252 +0000 UTC m=+0.078129489 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 04:02:22 np0005531887 nova_compute[186849]: 2025-11-22 09:02:22.581 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:24 np0005531887 nova_compute[186849]: 2025-11-22 09:02:24.148 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:25 np0005531887 podman[255637]: 2025-11-22 09:02:25.8461498 +0000 UTC m=+0.064513545 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 22 04:02:27 np0005531887 nova_compute[186849]: 2025-11-22 09:02:27.584 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:27 np0005531887 podman[255658]: 2025-11-22 09:02:27.825556723 +0000 UTC m=+0.046875791 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:02:29 np0005531887 nova_compute[186849]: 2025-11-22 09:02:29.150 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:32 np0005531887 nova_compute[186849]: 2025-11-22 09:02:32.589 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:34 np0005531887 nova_compute[186849]: 2025-11-22 09:02:34.152 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:35 np0005531887 podman[255683]: 2025-11-22 09:02:35.835930695 +0000 UTC m=+0.059111622 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:02:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:02:37.402 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:02:37.402 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:02:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:02:37.402 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:37 np0005531887 nova_compute[186849]: 2025-11-22 09:02:37.593 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:39 np0005531887 nova_compute[186849]: 2025-11-22 09:02:39.154 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:39 np0005531887 podman[255704]: 2025-11-22 09:02:39.846571774 +0000 UTC m=+0.064472264 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 04:02:39 np0005531887 podman[255705]: 2025-11-22 09:02:39.884514705 +0000 UTC m=+0.095783393 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 04:02:42 np0005531887 nova_compute[186849]: 2025-11-22 09:02:42.596 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:42 np0005531887 nova_compute[186849]: 2025-11-22 09:02:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:43 np0005531887 podman[255747]: 2025-11-22 09:02:43.85748962 +0000 UTC m=+0.077035922 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:02:44 np0005531887 nova_compute[186849]: 2025-11-22 09:02:44.156 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:47 np0005531887 nova_compute[186849]: 2025-11-22 09:02:47.599 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:49 np0005531887 nova_compute[186849]: 2025-11-22 09:02:49.158 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.767 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.801 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.802 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.802 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.802 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.832 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.832 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.833 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.833 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.985 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.986 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.26408004760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:02:50 np0005531887 nova_compute[186849]: 2025-11-22 09:02:50.987 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.125 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.125 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.544 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.621 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.622 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:02:51 np0005531887 nova_compute[186849]: 2025-11-22 09:02:51.623 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:02:52 np0005531887 nova_compute[186849]: 2025-11-22 09:02:52.590 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:52 np0005531887 nova_compute[186849]: 2025-11-22 09:02:52.602 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:52 np0005531887 nova_compute[186849]: 2025-11-22 09:02:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:52 np0005531887 podman[255771]: 2025-11-22 09:02:52.829383741 +0000 UTC m=+0.052484809 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 22 04:02:53 np0005531887 nova_compute[186849]: 2025-11-22 09:02:53.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:54 np0005531887 nova_compute[186849]: 2025-11-22 09:02:54.160 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:56 np0005531887 nova_compute[186849]: 2025-11-22 09:02:56.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:02:56 np0005531887 podman[255790]: 2025-11-22 09:02:56.855560293 +0000 UTC m=+0.072884371 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 04:02:57 np0005531887 nova_compute[186849]: 2025-11-22 09:02:57.606 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:02:58 np0005531887 podman[255811]: 2025-11-22 09:02:58.863113126 +0000 UTC m=+0.082406123 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:02:59 np0005531887 nova_compute[186849]: 2025-11-22 09:02:59.163 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:02 np0005531887 nova_compute[186849]: 2025-11-22 09:03:02.609 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:04 np0005531887 nova_compute[186849]: 2025-11-22 09:03:04.165 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:06 np0005531887 podman[255837]: 2025-11-22 09:03:06.890598257 +0000 UTC m=+0.097107996 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 04:03:07 np0005531887 nova_compute[186849]: 2025-11-22 09:03:07.614 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:09 np0005531887 nova_compute[186849]: 2025-11-22 09:03:09.168 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:10 np0005531887 podman[255859]: 2025-11-22 09:03:10.864449904 +0000 UTC m=+0.073952187 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 04:03:10 np0005531887 podman[255860]: 2025-11-22 09:03:10.90544357 +0000 UTC m=+0.106021933 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 04:03:12 np0005531887 nova_compute[186849]: 2025-11-22 09:03:12.616 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:13 np0005531887 nova_compute[186849]: 2025-11-22 09:03:13.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:14 np0005531887 nova_compute[186849]: 2025-11-22 09:03:14.170 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:14 np0005531887 podman[255907]: 2025-11-22 09:03:14.823753312 +0000 UTC m=+0.046676198 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:03:17 np0005531887 nova_compute[186849]: 2025-11-22 09:03:17.617 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:19 np0005531887 nova_compute[186849]: 2025-11-22 09:03:19.175 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:22 np0005531887 nova_compute[186849]: 2025-11-22 09:03:22.623 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:23 np0005531887 podman[255932]: 2025-11-22 09:03:23.840665838 +0000 UTC m=+0.059555854 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 04:03:24 np0005531887 nova_compute[186849]: 2025-11-22 09:03:24.176 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:27 np0005531887 nova_compute[186849]: 2025-11-22 09:03:27.625 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:27 np0005531887 podman[255951]: 2025-11-22 09:03:27.862251817 +0000 UTC m=+0.078496649 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 04:03:28 np0005531887 nova_compute[186849]: 2025-11-22 09:03:28.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:29 np0005531887 nova_compute[186849]: 2025-11-22 09:03:29.179 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:29 np0005531887 podman[255972]: 2025-11-22 09:03:29.835230603 +0000 UTC m=+0.058286012 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:03:32 np0005531887 nova_compute[186849]: 2025-11-22 09:03:32.627 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:34 np0005531887 nova_compute[186849]: 2025-11-22 09:03:34.182 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:03:37.403 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:03:37.404 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:03:37.404 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:37 np0005531887 nova_compute[186849]: 2025-11-22 09:03:37.631 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:37 np0005531887 podman[255997]: 2025-11-22 09:03:37.840018587 +0000 UTC m=+0.055143155 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 22 04:03:39 np0005531887 nova_compute[186849]: 2025-11-22 09:03:39.184 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:41 np0005531887 podman[256020]: 2025-11-22 09:03:41.861721878 +0000 UTC m=+0.076257984 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 22 04:03:41 np0005531887 podman[256021]: 2025-11-22 09:03:41.906159299 +0000 UTC m=+0.113483818 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:03:42 np0005531887 nova_compute[186849]: 2025-11-22 09:03:42.632 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:42 np0005531887 nova_compute[186849]: 2025-11-22 09:03:42.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:44 np0005531887 nova_compute[186849]: 2025-11-22 09:03:44.186 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:45 np0005531887 podman[256064]: 2025-11-22 09:03:45.835123793 +0000 UTC m=+0.059187785 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:03:47 np0005531887 nova_compute[186849]: 2025-11-22 09:03:47.636 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:49 np0005531887 nova_compute[186849]: 2025-11-22 09:03:49.188 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:50 np0005531887 nova_compute[186849]: 2025-11-22 09:03:50.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:50 np0005531887 nova_compute[186849]: 2025-11-22 09:03:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:03:50 np0005531887 nova_compute[186849]: 2025-11-22 09:03:50.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:03:50 np0005531887 nova_compute[186849]: 2025-11-22 09:03:50.788 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.639 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.814 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.815 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.815 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:52 np0005531887 nova_compute[186849]: 2025-11-22 09:03:52.815 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.093 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.095 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5719MB free_disk=73.2640609741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.095 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.096 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.207 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.208 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.246 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.265 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.268 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:03:53 np0005531887 nova_compute[186849]: 2025-11-22 09:03:53.268 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:03:54 np0005531887 nova_compute[186849]: 2025-11-22 09:03:54.190 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:54 np0005531887 podman[256088]: 2025-11-22 09:03:54.860960028 +0000 UTC m=+0.069959839 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 04:03:55 np0005531887 nova_compute[186849]: 2025-11-22 09:03:55.269 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:55 np0005531887 nova_compute[186849]: 2025-11-22 09:03:55.269 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:57 np0005531887 nova_compute[186849]: 2025-11-22 09:03:57.643 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:03:58 np0005531887 nova_compute[186849]: 2025-11-22 09:03:58.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:03:58 np0005531887 podman[256109]: 2025-11-22 09:03:58.841287073 +0000 UTC m=+0.060332782 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 04:03:59 np0005531887 nova_compute[186849]: 2025-11-22 09:03:59.193 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:00 np0005531887 podman[256128]: 2025-11-22 09:04:00.818643086 +0000 UTC m=+0.042732910 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:04:02 np0005531887 nova_compute[186849]: 2025-11-22 09:04:02.647 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:04 np0005531887 nova_compute[186849]: 2025-11-22 09:04:04.193 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:07 np0005531887 nova_compute[186849]: 2025-11-22 09:04:07.652 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:08 np0005531887 podman[256152]: 2025-11-22 09:04:08.8348535 +0000 UTC m=+0.059723627 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 04:04:09 np0005531887 nova_compute[186849]: 2025-11-22 09:04:09.195 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:12 np0005531887 nova_compute[186849]: 2025-11-22 09:04:12.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:12 np0005531887 podman[256175]: 2025-11-22 09:04:12.833909025 +0000 UTC m=+0.057049972 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:04:12 np0005531887 podman[256176]: 2025-11-22 09:04:12.861057962 +0000 UTC m=+0.080855117 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 04:04:14 np0005531887 nova_compute[186849]: 2025-11-22 09:04:14.198 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:14 np0005531887 nova_compute[186849]: 2025-11-22 09:04:14.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:16 np0005531887 podman[256222]: 2025-11-22 09:04:16.854212282 +0000 UTC m=+0.068988465 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:04:17 np0005531887 nova_compute[186849]: 2025-11-22 09:04:17.660 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:18 np0005531887 nova_compute[186849]: 2025-11-22 09:04:18.673 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:19 np0005531887 nova_compute[186849]: 2025-11-22 09:04:19.200 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:22 np0005531887 nova_compute[186849]: 2025-11-22 09:04:22.664 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:24 np0005531887 nova_compute[186849]: 2025-11-22 09:04:24.202 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:25 np0005531887 podman[256243]: 2025-11-22 09:04:25.831302681 +0000 UTC m=+0.050714767 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:04:27 np0005531887 nova_compute[186849]: 2025-11-22 09:04:27.668 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:29 np0005531887 nova_compute[186849]: 2025-11-22 09:04:29.202 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:29 np0005531887 podman[256262]: 2025-11-22 09:04:29.859660325 +0000 UTC m=+0.072677685 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 04:04:31 np0005531887 podman[256282]: 2025-11-22 09:04:31.825012183 +0000 UTC m=+0.048917172 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:04:32 np0005531887 nova_compute[186849]: 2025-11-22 09:04:32.673 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:34 np0005531887 nova_compute[186849]: 2025-11-22 09:04:34.203 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:04:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:04:37.405 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:04:37.406 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:04:37.406 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:37 np0005531887 nova_compute[186849]: 2025-11-22 09:04:37.676 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:39 np0005531887 nova_compute[186849]: 2025-11-22 09:04:39.205 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:39 np0005531887 podman[256306]: 2025-11-22 09:04:39.864886328 +0000 UTC m=+0.084346032 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Nov 22 04:04:42 np0005531887 nova_compute[186849]: 2025-11-22 09:04:42.679 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:42 np0005531887 nova_compute[186849]: 2025-11-22 09:04:42.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:43 np0005531887 podman[256326]: 2025-11-22 09:04:43.847374277 +0000 UTC m=+0.066899524 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:04:43 np0005531887 podman[256327]: 2025-11-22 09:04:43.883969216 +0000 UTC m=+0.091066768 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:04:44 np0005531887 nova_compute[186849]: 2025-11-22 09:04:44.207 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:47 np0005531887 nova_compute[186849]: 2025-11-22 09:04:47.683 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:47 np0005531887 podman[256370]: 2025-11-22 09:04:47.820120825 +0000 UTC m=+0.045108628 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:04:49 np0005531887 nova_compute[186849]: 2025-11-22 09:04:49.209 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:50 np0005531887 nova_compute[186849]: 2025-11-22 09:04:50.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:50 np0005531887 nova_compute[186849]: 2025-11-22 09:04:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:04:50 np0005531887 nova_compute[186849]: 2025-11-22 09:04:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:04:50 np0005531887 nova_compute[186849]: 2025-11-22 09:04:50.796 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.686 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.807 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.808 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.808 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.957 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.958 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5715MB free_disk=73.26410675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.958 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:04:52 np0005531887 nova_compute[186849]: 2025-11-22 09:04:52.959 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.027 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.028 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.059 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.079 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.080 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:04:53 np0005531887 nova_compute[186849]: 2025-11-22 09:04:53.080 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:04:54 np0005531887 nova_compute[186849]: 2025-11-22 09:04:54.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:55 np0005531887 nova_compute[186849]: 2025-11-22 09:04:55.080 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:55 np0005531887 nova_compute[186849]: 2025-11-22 09:04:55.081 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:55 np0005531887 nova_compute[186849]: 2025-11-22 09:04:55.081 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:56 np0005531887 podman[256396]: 2025-11-22 09:04:56.866849683 +0000 UTC m=+0.076631992 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 04:04:57 np0005531887 nova_compute[186849]: 2025-11-22 09:04:57.692 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:04:58 np0005531887 nova_compute[186849]: 2025-11-22 09:04:58.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:04:59 np0005531887 nova_compute[186849]: 2025-11-22 09:04:59.212 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:00 np0005531887 podman[256415]: 2025-11-22 09:05:00.832624182 +0000 UTC m=+0.054673523 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 04:05:02 np0005531887 nova_compute[186849]: 2025-11-22 09:05:02.696 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:02 np0005531887 podman[256437]: 2025-11-22 09:05:02.829062004 +0000 UTC m=+0.048465481 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:05:04 np0005531887 nova_compute[186849]: 2025-11-22 09:05:04.215 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:07 np0005531887 nova_compute[186849]: 2025-11-22 09:05:07.700 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:09 np0005531887 nova_compute[186849]: 2025-11-22 09:05:09.216 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:10 np0005531887 podman[256463]: 2025-11-22 09:05:10.83963735 +0000 UTC m=+0.058065517 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 04:05:12 np0005531887 nova_compute[186849]: 2025-11-22 09:05:12.704 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:14 np0005531887 nova_compute[186849]: 2025-11-22 09:05:14.216 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:14 np0005531887 podman[256486]: 2025-11-22 09:05:14.8593199 +0000 UTC m=+0.073655610 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, 
tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 04:05:14 np0005531887 podman[256487]: 2025-11-22 09:05:14.911596444 +0000 UTC m=+0.110419053 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 04:05:16 np0005531887 nova_compute[186849]: 2025-11-22 09:05:16.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:17 np0005531887 nova_compute[186849]: 2025-11-22 09:05:17.705 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:18 np0005531887 podman[256532]: 2025-11-22 09:05:18.849035567 +0000 UTC m=+0.060910396 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:05:19 np0005531887 nova_compute[186849]: 2025-11-22 09:05:19.219 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:22 np0005531887 nova_compute[186849]: 2025-11-22 09:05:22.709 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:24 np0005531887 nova_compute[186849]: 2025-11-22 09:05:24.220 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:27 np0005531887 nova_compute[186849]: 2025-11-22 09:05:27.713 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:27 np0005531887 podman[256556]: 2025-11-22 09:05:27.836522 +0000 UTC m=+0.055641306 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:05:29 np0005531887 nova_compute[186849]: 2025-11-22 09:05:29.222 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:31 np0005531887 nova_compute[186849]: 2025-11-22 09:05:31.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:31 np0005531887 podman[256575]: 2025-11-22 09:05:31.841959803 +0000 UTC m=+0.061971023 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 04:05:32 np0005531887 nova_compute[186849]: 2025-11-22 09:05:32.716 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:33 np0005531887 podman[256595]: 2025-11-22 09:05:33.832368616 +0000 UTC m=+0.054156201 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:05:34 np0005531887 nova_compute[186849]: 2025-11-22 09:05:34.223 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:35 np0005531887 nova_compute[186849]: 2025-11-22 09:05:35.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:35 np0005531887 nova_compute[186849]: 2025-11-22 09:05:35.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:05:35 np0005531887 nova_compute[186849]: 2025-11-22 09:05:35.793 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:05:36 np0005531887 nova_compute[186849]: 2025-11-22 09:05:36.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:36 np0005531887 nova_compute[186849]: 2025-11-22 09:05:36.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:05:37.406 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:05:37.407 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:05:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:05:37.407 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:05:37 np0005531887 nova_compute[186849]: 2025-11-22 09:05:37.720 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:39 np0005531887 nova_compute[186849]: 2025-11-22 09:05:39.225 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:41 np0005531887 podman[256621]: 2025-11-22 09:05:41.872867707 +0000 UTC m=+0.098090240 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.buildah.version=1.33.7)
Nov 22 04:05:42 np0005531887 nova_compute[186849]: 2025-11-22 09:05:42.724 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:42 np0005531887 nova_compute[186849]: 2025-11-22 09:05:42.798 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:44 np0005531887 nova_compute[186849]: 2025-11-22 09:05:44.228 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:45 np0005531887 podman[256642]: 2025-11-22 09:05:45.837621629 +0000 UTC m=+0.059403489 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:05:45 np0005531887 podman[256643]: 2025-11-22 09:05:45.889340039 +0000 UTC m=+0.105763118 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:05:47 np0005531887 nova_compute[186849]: 2025-11-22 09:05:47.728 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:47 np0005531887 nova_compute[186849]: 2025-11-22 09:05:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:49 np0005531887 nova_compute[186849]: 2025-11-22 09:05:49.229 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:49 np0005531887 podman[256691]: 2025-11-22 09:05:49.833183468 +0000 UTC m=+0.058258452 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:05:50 np0005531887 nova_compute[186849]: 2025-11-22 09:05:50.805 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:50 np0005531887 nova_compute[186849]: 2025-11-22 09:05:50.806 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:05:50 np0005531887 nova_compute[186849]: 2025-11-22 09:05:50.806 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:05:50 np0005531887 nova_compute[186849]: 2025-11-22 09:05:50.836 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:05:52 np0005531887 nova_compute[186849]: 2025-11-22 09:05:52.732 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.818 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.818 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.818 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:05:53 np0005531887 nova_compute[186849]: 2025-11-22 09:05:53.819 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.016 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.017 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5724MB free_disk=73.26412582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.018 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.018 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.120 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.121 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.167 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.188 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.190 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.190 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:05:54 np0005531887 nova_compute[186849]: 2025-11-22 09:05:54.232 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:55 np0005531887 nova_compute[186849]: 2025-11-22 09:05:55.190 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:56 np0005531887 nova_compute[186849]: 2025-11-22 09:05:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:56 np0005531887 nova_compute[186849]: 2025-11-22 09:05:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:57 np0005531887 nova_compute[186849]: 2025-11-22 09:05:57.735 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:05:58 np0005531887 nova_compute[186849]: 2025-11-22 09:05:58.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:05:58 np0005531887 podman[256717]: 2025-11-22 09:05:58.851725914 +0000 UTC m=+0.067234012 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 04:05:59 np0005531887 nova_compute[186849]: 2025-11-22 09:05:59.234 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:02 np0005531887 nova_compute[186849]: 2025-11-22 09:06:02.738 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:02 np0005531887 podman[256736]: 2025-11-22 09:06:02.847498199 +0000 UTC m=+0.058329083 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 04:06:04 np0005531887 nova_compute[186849]: 2025-11-22 09:06:04.235 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:04 np0005531887 podman[256756]: 2025-11-22 09:06:04.823839657 +0000 UTC m=+0.045862077 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:06:07 np0005531887 nova_compute[186849]: 2025-11-22 09:06:07.742 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:09 np0005531887 nova_compute[186849]: 2025-11-22 09:06:09.238 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:12 np0005531887 nova_compute[186849]: 2025-11-22 09:06:12.745 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:12 np0005531887 podman[256781]: 2025-11-22 09:06:12.829764719 +0000 UTC m=+0.056266373 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, version=9.6, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 22 04:06:14 np0005531887 nova_compute[186849]: 2025-11-22 09:06:14.240 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:16 np0005531887 nova_compute[186849]: 2025-11-22 09:06:16.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:16 np0005531887 podman[256803]: 2025-11-22 09:06:16.826290913 +0000 UTC m=+0.051772823 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 04:06:16 np0005531887 podman[256804]: 2025-11-22 09:06:16.844738775 +0000 UTC m=+0.065047598 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 04:06:17 np0005531887 nova_compute[186849]: 2025-11-22 09:06:17.746 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:19 np0005531887 nova_compute[186849]: 2025-11-22 09:06:19.241 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:20 np0005531887 podman[256845]: 2025-11-22 09:06:20.835209189 +0000 UTC m=+0.050648364 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:06:22 np0005531887 nova_compute[186849]: 2025-11-22 09:06:22.751 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:24 np0005531887 nova_compute[186849]: 2025-11-22 09:06:24.243 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:27 np0005531887 nova_compute[186849]: 2025-11-22 09:06:27.754 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:29 np0005531887 nova_compute[186849]: 2025-11-22 09:06:29.246 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:29 np0005531887 podman[256869]: 2025-11-22 09:06:29.825159842 +0000 UTC m=+0.047686792 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:06:32 np0005531887 nova_compute[186849]: 2025-11-22 09:06:32.758 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:33 np0005531887 podman[256888]: 2025-11-22 09:06:33.851698972 +0000 UTC m=+0.074885649 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:06:34 np0005531887 nova_compute[186849]: 2025-11-22 09:06:34.247 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:35 np0005531887 podman[256908]: 2025-11-22 09:06:35.826967724 +0000 UTC m=+0.047594579 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:06:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:37.408 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:37.409 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:37.409 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:37 np0005531887 nova_compute[186849]: 2025-11-22 09:06:37.655 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:37 np0005531887 nova_compute[186849]: 2025-11-22 09:06:37.762 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:39 np0005531887 nova_compute[186849]: 2025-11-22 09:06:39.250 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:42 np0005531887 nova_compute[186849]: 2025-11-22 09:06:42.766 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:43 np0005531887 nova_compute[186849]: 2025-11-22 09:06:43.808 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:43 np0005531887 podman[256933]: 2025-11-22 09:06:43.874804262 +0000 UTC m=+0.087633733 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Nov 22 04:06:44 np0005531887 nova_compute[186849]: 2025-11-22 09:06:44.253 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:47 np0005531887 nova_compute[186849]: 2025-11-22 09:06:47.770 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:47 np0005531887 podman[256956]: 2025-11-22 09:06:47.834070314 +0000 UTC m=+0.054535181 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118)
Nov 22 04:06:47 np0005531887 podman[256957]: 2025-11-22 09:06:47.861827875 +0000 UTC m=+0.075993247 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 22 04:06:49 np0005531887 nova_compute[186849]: 2025-11-22 09:06:49.257 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.255 186853 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.275 186853 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:06:50 np0005531887 nova_compute[186849]: 2025-11-22 09:06:50.794 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:06:51 np0005531887 podman[257001]: 2025-11-22 09:06:51.829226452 +0000 UTC m=+0.054116710 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:06:52 np0005531887 nova_compute[186849]: 2025-11-22 09:06:52.775 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.259 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.967 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.969 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5732MB free_disk=73.26410675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.969 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:06:54 np0005531887 nova_compute[186849]: 2025-11-22 09:06:54.970 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.191 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.192 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.214 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.288 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.288 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.312 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.360 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.410 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.464 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.466 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.467 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:06:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:55.943 104084 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 22 04:06:55 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:55.944 104084 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 22 04:06:55 np0005531887 nova_compute[186849]: 2025-11-22 09:06:55.944 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:57 np0005531887 nova_compute[186849]: 2025-11-22 09:06:57.466 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:57 np0005531887 nova_compute[186849]: 2025-11-22 09:06:57.467 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:06:57 np0005531887 nova_compute[186849]: 2025-11-22 09:06:57.777 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:59 np0005531887 nova_compute[186849]: 2025-11-22 09:06:59.260 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:06:59 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:06:59.946 104084 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=73ab1342-b2af-4236-8199-7d435ebce194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 22 04:07:00 np0005531887 nova_compute[186849]: 2025-11-22 09:07:00.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:00 np0005531887 podman[257025]: 2025-11-22 09:07:00.850149267 +0000 UTC m=+0.075768612 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:07:02 np0005531887 nova_compute[186849]: 2025-11-22 09:07:02.781 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:04 np0005531887 nova_compute[186849]: 2025-11-22 09:07:04.262 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:04 np0005531887 podman[257044]: 2025-11-22 09:07:04.829897457 +0000 UTC m=+0.056362945 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:07:06 np0005531887 podman[257065]: 2025-11-22 09:07:06.82798723 +0000 UTC m=+0.044769481 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:07:07 np0005531887 nova_compute[186849]: 2025-11-22 09:07:07.784 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:09 np0005531887 nova_compute[186849]: 2025-11-22 09:07:09.265 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:12 np0005531887 nova_compute[186849]: 2025-11-22 09:07:12.788 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:14 np0005531887 nova_compute[186849]: 2025-11-22 09:07:14.268 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:14 np0005531887 podman[257089]: 2025-11-22 09:07:14.829007841 +0000 UTC m=+0.051408674 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 04:07:17 np0005531887 nova_compute[186849]: 2025-11-22 09:07:17.793 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:18 np0005531887 nova_compute[186849]: 2025-11-22 09:07:18.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:18 np0005531887 podman[257110]: 2025-11-22 09:07:18.863646687 +0000 UTC m=+0.070496591 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 04:07:18 np0005531887 podman[257111]: 2025-11-22 09:07:18.907197878 +0000 UTC m=+0.110696121 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 04:07:19 np0005531887 nova_compute[186849]: 2025-11-22 09:07:19.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:22 np0005531887 nova_compute[186849]: 2025-11-22 09:07:22.796 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:22 np0005531887 podman[257152]: 2025-11-22 09:07:22.83167349 +0000 UTC m=+0.057047641 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:07:24 np0005531887 nova_compute[186849]: 2025-11-22 09:07:24.272 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:27 np0005531887 nova_compute[186849]: 2025-11-22 09:07:27.801 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:29 np0005531887 nova_compute[186849]: 2025-11-22 09:07:29.274 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:31 np0005531887 nova_compute[186849]: 2025-11-22 09:07:31.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:31 np0005531887 podman[257176]: 2025-11-22 09:07:31.834301708 +0000 UTC m=+0.055905234 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:07:32 np0005531887 nova_compute[186849]: 2025-11-22 09:07:32.805 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:34 np0005531887 nova_compute[186849]: 2025-11-22 09:07:34.276 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:35 np0005531887 podman[257197]: 2025-11-22 09:07:35.838892939 +0000 UTC m=+0.060455045 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 04:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:07:37.409 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:07:37.410 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:07:37.410 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:37 np0005531887 nova_compute[186849]: 2025-11-22 09:07:37.808 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:37 np0005531887 podman[257217]: 2025-11-22 09:07:37.81899093 +0000 UTC m=+0.044467713 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:07:39 np0005531887 nova_compute[186849]: 2025-11-22 09:07:39.277 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:42 np0005531887 nova_compute[186849]: 2025-11-22 09:07:42.812 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:44 np0005531887 nova_compute[186849]: 2025-11-22 09:07:44.278 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:44 np0005531887 nova_compute[186849]: 2025-11-22 09:07:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:45 np0005531887 podman[257241]: 2025-11-22 09:07:45.858148358 +0000 UTC m=+0.082420405 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 22 04:07:47 np0005531887 nova_compute[186849]: 2025-11-22 09:07:47.815 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:49 np0005531887 nova_compute[186849]: 2025-11-22 09:07:49.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:49 np0005531887 podman[257264]: 2025-11-22 09:07:49.835562062 +0000 UTC m=+0.052029559 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 04:07:49 np0005531887 podman[257265]: 2025-11-22 09:07:49.883022597 +0000 UTC m=+0.083003178 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 04:07:51 np0005531887 nova_compute[186849]: 2025-11-22 09:07:51.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:51 np0005531887 nova_compute[186849]: 2025-11-22 09:07:51.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:07:51 np0005531887 nova_compute[186849]: 2025-11-22 09:07:51.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:07:51 np0005531887 nova_compute[186849]: 2025-11-22 09:07:51.784 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:07:52 np0005531887 nova_compute[186849]: 2025-11-22 09:07:52.817 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:53 np0005531887 podman[257307]: 2025-11-22 09:07:53.832275939 +0000 UTC m=+0.055034783 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.282 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.800 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.800 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.960 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.961 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=73.26410675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.961 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:07:54 np0005531887 nova_compute[186849]: 2025-11-22 09:07:54.962 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.033 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.034 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.170 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.186 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.188 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:07:55 np0005531887 nova_compute[186849]: 2025-11-22 09:07:55.189 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:07:56 np0005531887 nova_compute[186849]: 2025-11-22 09:07:56.189 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:56 np0005531887 nova_compute[186849]: 2025-11-22 09:07:56.190 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:56 np0005531887 nova_compute[186849]: 2025-11-22 09:07:56.190 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:07:56 np0005531887 nova_compute[186849]: 2025-11-22 09:07:56.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:57 np0005531887 nova_compute[186849]: 2025-11-22 09:07:57.818 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:07:58 np0005531887 nova_compute[186849]: 2025-11-22 09:07:58.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:07:59 np0005531887 nova_compute[186849]: 2025-11-22 09:07:59.284 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:01 np0005531887 nova_compute[186849]: 2025-11-22 09:08:01.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:02 np0005531887 podman[257331]: 2025-11-22 09:08:02.818944213 +0000 UTC m=+0.043450608 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 04:08:02 np0005531887 nova_compute[186849]: 2025-11-22 09:08:02.822 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:04 np0005531887 nova_compute[186849]: 2025-11-22 09:08:04.287 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:06 np0005531887 podman[257350]: 2025-11-22 09:08:06.869188365 +0000 UTC m=+0.086261939 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:08:07 np0005531887 nova_compute[186849]: 2025-11-22 09:08:07.827 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:08 np0005531887 podman[257370]: 2025-11-22 09:08:08.821966603 +0000 UTC m=+0.043945360 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:08:09 np0005531887 nova_compute[186849]: 2025-11-22 09:08:09.289 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:12 np0005531887 nova_compute[186849]: 2025-11-22 09:08:12.831 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:14 np0005531887 nova_compute[186849]: 2025-11-22 09:08:14.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:16 np0005531887 podman[257394]: 2025-11-22 09:08:16.875402553 +0000 UTC m=+0.093496888 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 22 04:08:17 np0005531887 nova_compute[186849]: 2025-11-22 09:08:17.835 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:19 np0005531887 nova_compute[186849]: 2025-11-22 09:08:19.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:20 np0005531887 nova_compute[186849]: 2025-11-22 09:08:20.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:20 np0005531887 podman[257415]: 2025-11-22 09:08:20.828051157 +0000 UTC m=+0.051078225 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:08:20 np0005531887 podman[257416]: 2025-11-22 09:08:20.870707275 +0000 UTC m=+0.088528554 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:08:22 np0005531887 nova_compute[186849]: 2025-11-22 09:08:22.840 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:24 np0005531887 nova_compute[186849]: 2025-11-22 09:08:24.295 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:24 np0005531887 podman[257457]: 2025-11-22 09:08:24.828157009 +0000 UTC m=+0.049747232 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:08:27 np0005531887 nova_compute[186849]: 2025-11-22 09:08:27.844 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:29 np0005531887 nova_compute[186849]: 2025-11-22 09:08:29.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:32 np0005531887 nova_compute[186849]: 2025-11-22 09:08:32.849 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:33 np0005531887 podman[257481]: 2025-11-22 09:08:33.827362451 +0000 UTC m=+0.048335598 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:08:34 np0005531887 nova_compute[186849]: 2025-11-22 09:08:34.298 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:08:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:08:37.410 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:08:37.411 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:08:37.411 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:37 np0005531887 podman[257500]: 2025-11-22 09:08:37.847505559 +0000 UTC m=+0.069132506 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:08:37 np0005531887 nova_compute[186849]: 2025-11-22 09:08:37.851 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:39 np0005531887 nova_compute[186849]: 2025-11-22 09:08:39.299 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:39 np0005531887 podman[257520]: 2025-11-22 09:08:39.818912313 +0000 UTC m=+0.044817829 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:08:42 np0005531887 nova_compute[186849]: 2025-11-22 09:08:42.855 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:44 np0005531887 nova_compute[186849]: 2025-11-22 09:08:44.301 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:44 np0005531887 nova_compute[186849]: 2025-11-22 09:08:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:47 np0005531887 podman[257546]: 2025-11-22 09:08:47.838558215 +0000 UTC m=+0.059191782 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 04:08:47 np0005531887 nova_compute[186849]: 2025-11-22 09:08:47.858 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:49 np0005531887 nova_compute[186849]: 2025-11-22 09:08:49.302 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:51 np0005531887 podman[257567]: 2025-11-22 09:08:51.827450631 +0000 UTC m=+0.053464091 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:08:51 np0005531887 podman[257568]: 2025-11-22 09:08:51.854350211 +0000 UTC m=+0.076988227 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 04:08:52 np0005531887 nova_compute[186849]: 2025-11-22 09:08:52.860 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:53 np0005531887 nova_compute[186849]: 2025-11-22 09:08:53.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:53 np0005531887 nova_compute[186849]: 2025-11-22 09:08:53.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:08:53 np0005531887 nova_compute[186849]: 2025-11-22 09:08:53.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:08:53 np0005531887 nova_compute[186849]: 2025-11-22 09:08:53.802 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:08:54 np0005531887 nova_compute[186849]: 2025-11-22 09:08:54.304 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:55 np0005531887 podman[257611]: 2025-11-22 09:08:55.824623861 +0000 UTC m=+0.044889390 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.796 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.797 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.982 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.983 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5714MB free_disk=73.26412582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.984 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:08:56 np0005531887 nova_compute[186849]: 2025-11-22 09:08:56.984 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.054 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.055 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.073 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.086 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.088 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.088 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:08:57 np0005531887 nova_compute[186849]: 2025-11-22 09:08:57.865 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:08:58 np0005531887 nova_compute[186849]: 2025-11-22 09:08:58.088 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:58 np0005531887 nova_compute[186849]: 2025-11-22 09:08:58.089 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:08:59 np0005531887 nova_compute[186849]: 2025-11-22 09:08:59.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:00 np0005531887 nova_compute[186849]: 2025-11-22 09:09:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:02 np0005531887 nova_compute[186849]: 2025-11-22 09:09:02.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:02 np0005531887 nova_compute[186849]: 2025-11-22 09:09:02.869 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:04 np0005531887 nova_compute[186849]: 2025-11-22 09:09:04.307 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:04 np0005531887 podman[257635]: 2025-11-22 09:09:04.906544259 +0000 UTC m=+0.117777697 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 04:09:07 np0005531887 nova_compute[186849]: 2025-11-22 09:09:07.873 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:08 np0005531887 podman[257653]: 2025-11-22 09:09:08.950024404 +0000 UTC m=+0.166412379 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:09:09 np0005531887 nova_compute[186849]: 2025-11-22 09:09:09.309 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:10 np0005531887 podman[257675]: 2025-11-22 09:09:10.83212596 +0000 UTC m=+0.051393460 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:09:12 np0005531887 nova_compute[186849]: 2025-11-22 09:09:12.878 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:14 np0005531887 nova_compute[186849]: 2025-11-22 09:09:14.313 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:17 np0005531887 nova_compute[186849]: 2025-11-22 09:09:17.882 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:18 np0005531887 podman[257699]: 2025-11-22 09:09:18.837670925 +0000 UTC m=+0.057321965 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6)
Nov 22 04:09:19 np0005531887 nova_compute[186849]: 2025-11-22 09:09:19.317 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:21 np0005531887 nova_compute[186849]: 2025-11-22 09:09:21.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:22 np0005531887 podman[257720]: 2025-11-22 09:09:22.832452366 +0000 UTC m=+0.056668800 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:09:22 np0005531887 podman[257721]: 2025-11-22 09:09:22.879635812 +0000 UTC m=+0.099655123 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 04:09:22 np0005531887 nova_compute[186849]: 2025-11-22 09:09:22.884 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:24 np0005531887 nova_compute[186849]: 2025-11-22 09:09:24.320 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:26 np0005531887 podman[257763]: 2025-11-22 09:09:26.824723377 +0000 UTC m=+0.047611798 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:09:27 np0005531887 nova_compute[186849]: 2025-11-22 09:09:27.886 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:29 np0005531887 nova_compute[186849]: 2025-11-22 09:09:29.320 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:32 np0005531887 nova_compute[186849]: 2025-11-22 09:09:32.891 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:34 np0005531887 nova_compute[186849]: 2025-11-22 09:09:34.324 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:34 np0005531887 nova_compute[186849]: 2025-11-22 09:09:34.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:35 np0005531887 podman[257789]: 2025-11-22 09:09:35.836451655 +0000 UTC m=+0.057611212 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Nov 22 04:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:09:37.412 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:09:37.412 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:09:37.413 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:37 np0005531887 nova_compute[186849]: 2025-11-22 09:09:37.894 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:39 np0005531887 nova_compute[186849]: 2025-11-22 09:09:39.324 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:39 np0005531887 podman[257809]: 2025-11-22 09:09:39.84524981 +0000 UTC m=+0.065589697 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:09:41 np0005531887 podman[257829]: 2025-11-22 09:09:41.847495449 +0000 UTC m=+0.058965836 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:09:42 np0005531887 nova_compute[186849]: 2025-11-22 09:09:42.898 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:44 np0005531887 nova_compute[186849]: 2025-11-22 09:09:44.325 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:44 np0005531887 nova_compute[186849]: 2025-11-22 09:09:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:47 np0005531887 nova_compute[186849]: 2025-11-22 09:09:47.902 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:49 np0005531887 nova_compute[186849]: 2025-11-22 09:09:49.336 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:49 np0005531887 podman[257856]: 2025-11-22 09:09:49.848435681 +0000 UTC m=+0.066497251 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 22 04:09:52 np0005531887 nova_compute[186849]: 2025-11-22 09:09:52.905 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:53 np0005531887 podman[257877]: 2025-11-22 09:09:53.843083909 +0000 UTC m=+0.055121422 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:09:53 np0005531887 podman[257878]: 2025-11-22 09:09:53.870340746 +0000 UTC m=+0.084937322 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 04:09:54 np0005531887 nova_compute[186849]: 2025-11-22 09:09:54.337 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:55 np0005531887 nova_compute[186849]: 2025-11-22 09:09:55.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:55 np0005531887 nova_compute[186849]: 2025-11-22 09:09:55.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:09:55 np0005531887 nova_compute[186849]: 2025-11-22 09:09:55.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:09:55 np0005531887 nova_compute[186849]: 2025-11-22 09:09:55.785 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.770 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.962 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.963 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5718MB free_disk=73.26412582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.963 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:09:56 np0005531887 nova_compute[186849]: 2025-11-22 09:09:56.963 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.030 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.031 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.060 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.081 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.083 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.083 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:09:57 np0005531887 podman[257923]: 2025-11-22 09:09:57.866942354 +0000 UTC m=+0.088251304 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:09:57 np0005531887 nova_compute[186849]: 2025-11-22 09:09:57.910 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:09:59 np0005531887 nova_compute[186849]: 2025-11-22 09:09:59.084 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:59 np0005531887 nova_compute[186849]: 2025-11-22 09:09:59.084 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:09:59 np0005531887 nova_compute[186849]: 2025-11-22 09:09:59.339 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:00 np0005531887 nova_compute[186849]: 2025-11-22 09:10:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:02 np0005531887 nova_compute[186849]: 2025-11-22 09:10:02.914 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:04 np0005531887 nova_compute[186849]: 2025-11-22 09:10:04.340 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:04 np0005531887 nova_compute[186849]: 2025-11-22 09:10:04.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:06 np0005531887 podman[257948]: 2025-11-22 09:10:06.878976229 +0000 UTC m=+0.090450227 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 04:10:07 np0005531887 nova_compute[186849]: 2025-11-22 09:10:07.917 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:09 np0005531887 nova_compute[186849]: 2025-11-22 09:10:09.342 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:10 np0005531887 podman[257970]: 2025-11-22 09:10:10.853625898 +0000 UTC m=+0.075021800 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 04:10:12 np0005531887 podman[257991]: 2025-11-22 09:10:12.858039521 +0000 UTC m=+0.066451499 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:10:12 np0005531887 nova_compute[186849]: 2025-11-22 09:10:12.944 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:14 np0005531887 nova_compute[186849]: 2025-11-22 09:10:14.344 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:17 np0005531887 nova_compute[186849]: 2025-11-22 09:10:17.950 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:19 np0005531887 nova_compute[186849]: 2025-11-22 09:10:19.345 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:20 np0005531887 podman[258015]: 2025-11-22 09:10:20.852107585 +0000 UTC m=+0.064638955 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 04:10:22 np0005531887 nova_compute[186849]: 2025-11-22 09:10:22.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:22 np0005531887 nova_compute[186849]: 2025-11-22 09:10:22.990 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:24 np0005531887 nova_compute[186849]: 2025-11-22 09:10:24.346 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:24 np0005531887 podman[258036]: 2025-11-22 09:10:24.841939846 +0000 UTC m=+0.064125573 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 04:10:24 np0005531887 podman[258037]: 2025-11-22 09:10:24.934420822 +0000 UTC m=+0.141397876 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 04:10:27 np0005531887 nova_compute[186849]: 2025-11-22 09:10:27.994 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:28 np0005531887 podman[258081]: 2025-11-22 09:10:28.885203645 +0000 UTC m=+0.096650149 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:10:29 np0005531887 nova_compute[186849]: 2025-11-22 09:10:29.348 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:32 np0005531887 nova_compute[186849]: 2025-11-22 09:10:32.998 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:34 np0005531887 nova_compute[186849]: 2025-11-22 09:10:34.350 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:10:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:10:37.413 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:10:37.413 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:10:37.413 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:37 np0005531887 podman[258106]: 2025-11-22 09:10:37.82939769 +0000 UTC m=+0.051987356 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 22 04:10:38 np0005531887 nova_compute[186849]: 2025-11-22 09:10:38.002 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:39 np0005531887 nova_compute[186849]: 2025-11-22 09:10:39.351 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:41 np0005531887 podman[258127]: 2025-11-22 09:10:41.856196647 +0000 UTC m=+0.080285359 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 04:10:43 np0005531887 nova_compute[186849]: 2025-11-22 09:10:43.036 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:43 np0005531887 podman[258148]: 2025-11-22 09:10:43.818977398 +0000 UTC m=+0.044933212 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:10:44 np0005531887 nova_compute[186849]: 2025-11-22 09:10:44.353 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:44 np0005531887 nova_compute[186849]: 2025-11-22 09:10:44.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:48 np0005531887 nova_compute[186849]: 2025-11-22 09:10:48.039 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:48 np0005531887 nova_compute[186849]: 2025-11-22 09:10:48.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:48 np0005531887 nova_compute[186849]: 2025-11-22 09:10:48.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:10:48 np0005531887 nova_compute[186849]: 2025-11-22 09:10:48.794 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:10:49 np0005531887 nova_compute[186849]: 2025-11-22 09:10:49.354 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:49 np0005531887 nova_compute[186849]: 2025-11-22 09:10:49.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:49 np0005531887 nova_compute[186849]: 2025-11-22 09:10:49.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:10:51 np0005531887 podman[258172]: 2025-11-22 09:10:51.835157724 +0000 UTC m=+0.057066240 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Nov 22 04:10:53 np0005531887 nova_compute[186849]: 2025-11-22 09:10:53.043 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:54 np0005531887 nova_compute[186849]: 2025-11-22 09:10:54.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:54 np0005531887 nova_compute[186849]: 2025-11-22 09:10:54.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:55 np0005531887 podman[258193]: 2025-11-22 09:10:55.834889106 +0000 UTC m=+0.058672448 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 04:10:55 np0005531887 podman[258194]: 2025-11-22 09:10:55.865958818 +0000 UTC m=+0.087572997 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.811 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.812 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.812 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.812 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.989 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.990 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5720MB free_disk=73.26410675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.991 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:10:56 np0005531887 nova_compute[186849]: 2025-11-22 09:10:56.991 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.060 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.061 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.086 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.101 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.103 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:10:57 np0005531887 nova_compute[186849]: 2025-11-22 09:10:57.103 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.048 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.089 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.090 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.090 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.110 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:58 np0005531887 nova_compute[186849]: 2025-11-22 09:10:58.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:10:59 np0005531887 nova_compute[186849]: 2025-11-22 09:10:59.360 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:10:59 np0005531887 nova_compute[186849]: 2025-11-22 09:10:59.771 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:10:59 np0005531887 podman[258234]: 2025-11-22 09:10:59.831051591 +0000 UTC m=+0.051947203 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:11:00 np0005531887 nova_compute[186849]: 2025-11-22 09:11:00.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:01 np0005531887 nova_compute[186849]: 2025-11-22 09:11:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:03 np0005531887 nova_compute[186849]: 2025-11-22 09:11:03.052 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:04 np0005531887 nova_compute[186849]: 2025-11-22 09:11:04.364 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:06 np0005531887 nova_compute[186849]: 2025-11-22 09:11:06.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:08 np0005531887 nova_compute[186849]: 2025-11-22 09:11:08.081 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:08 np0005531887 podman[258258]: 2025-11-22 09:11:08.827062576 +0000 UTC m=+0.047188967 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 22 04:11:09 np0005531887 nova_compute[186849]: 2025-11-22 09:11:09.367 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:12 np0005531887 podman[258278]: 2025-11-22 09:11:12.909654858 +0000 UTC m=+0.124877100 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 04:11:13 np0005531887 nova_compute[186849]: 2025-11-22 09:11:13.085 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:14 np0005531887 nova_compute[186849]: 2025-11-22 09:11:14.368 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:14 np0005531887 podman[258298]: 2025-11-22 09:11:14.835748422 +0000 UTC m=+0.058481754 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:11:18 np0005531887 nova_compute[186849]: 2025-11-22 09:11:18.132 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:19 np0005531887 nova_compute[186849]: 2025-11-22 09:11:19.370 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:22 np0005531887 nova_compute[186849]: 2025-11-22 09:11:22.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:22 np0005531887 podman[258322]: 2025-11-22 09:11:22.838316986 +0000 UTC m=+0.062739318 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 04:11:23 np0005531887 nova_compute[186849]: 2025-11-22 09:11:23.134 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:24 np0005531887 nova_compute[186849]: 2025-11-22 09:11:24.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:26 np0005531887 podman[258343]: 2025-11-22 09:11:26.858574661 +0000 UTC m=+0.073820290 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:11:26 np0005531887 podman[258344]: 2025-11-22 09:11:26.926602217 +0000 UTC m=+0.132055966 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 04:11:28 np0005531887 nova_compute[186849]: 2025-11-22 09:11:28.138 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:29 np0005531887 nova_compute[186849]: 2025-11-22 09:11:29.374 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:30 np0005531887 podman[258390]: 2025-11-22 09:11:30.843417158 +0000 UTC m=+0.067184308 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:11:33 np0005531887 nova_compute[186849]: 2025-11-22 09:11:33.144 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:34 np0005531887 nova_compute[186849]: 2025-11-22 09:11:34.377 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:11:37.414 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:11:37.414 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:11:37.414 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:11:37 np0005531887 nova_compute[186849]: 2025-11-22 09:11:37.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:38 np0005531887 nova_compute[186849]: 2025-11-22 09:11:38.189 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:39 np0005531887 nova_compute[186849]: 2025-11-22 09:11:39.379 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:39 np0005531887 podman[258416]: 2025-11-22 09:11:39.839721248 +0000 UTC m=+0.059858978 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:11:43 np0005531887 nova_compute[186849]: 2025-11-22 09:11:43.192 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:43 np0005531887 podman[258436]: 2025-11-22 09:11:43.844151525 +0000 UTC m=+0.056152057 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 04:11:44 np0005531887 nova_compute[186849]: 2025-11-22 09:11:44.381 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:44 np0005531887 nova_compute[186849]: 2025-11-22 09:11:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:45 np0005531887 podman[258456]: 2025-11-22 09:11:45.855032446 +0000 UTC m=+0.080176115 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:11:48 np0005531887 nova_compute[186849]: 2025-11-22 09:11:48.201 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:49 np0005531887 nova_compute[186849]: 2025-11-22 09:11:49.384 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:53 np0005531887 nova_compute[186849]: 2025-11-22 09:11:53.205 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:53 np0005531887 podman[258481]: 2025-11-22 09:11:53.845432469 +0000 UTC m=+0.065431974 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 22 04:11:54 np0005531887 nova_compute[186849]: 2025-11-22 09:11:54.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:57 np0005531887 podman[258503]: 2025-11-22 09:11:57.851783676 +0000 UTC m=+0.067709551 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 04:11:57 np0005531887 podman[258504]: 2025-11-22 09:11:57.89520577 +0000 UTC m=+0.100671158 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.207 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.800 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.965 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.967 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=73.26412582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.967 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:11:58 np0005531887 nova_compute[186849]: 2025-11-22 09:11:58.968 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.290 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.291 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.307 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.389 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.464 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.465 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.495 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.527 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.558 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.581 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.584 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:11:59 np0005531887 nova_compute[186849]: 2025-11-22 09:11:59.584 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.585 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.585 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.585 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.606 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.607 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:00 np0005531887 nova_compute[186849]: 2025-11-22 09:12:00.607 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:12:01 np0005531887 nova_compute[186849]: 2025-11-22 09:12:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:01 np0005531887 podman[258549]: 2025-11-22 09:12:01.902804434 +0000 UTC m=+0.114801223 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 04:12:02 np0005531887 nova_compute[186849]: 2025-11-22 09:12:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:02 np0005531887 nova_compute[186849]: 2025-11-22 09:12:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:03 np0005531887 nova_compute[186849]: 2025-11-22 09:12:03.211 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:04 np0005531887 nova_compute[186849]: 2025-11-22 09:12:04.391 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:06 np0005531887 nova_compute[186849]: 2025-11-22 09:12:06.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:08 np0005531887 nova_compute[186849]: 2025-11-22 09:12:08.215 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:09 np0005531887 nova_compute[186849]: 2025-11-22 09:12:09.393 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:10 np0005531887 podman[258575]: 2025-11-22 09:12:10.824791703 +0000 UTC m=+0.050105429 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 04:12:13 np0005531887 nova_compute[186849]: 2025-11-22 09:12:13.252 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:14 np0005531887 nova_compute[186849]: 2025-11-22 09:12:14.394 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:14 np0005531887 podman[258595]: 2025-11-22 09:12:14.829342604 +0000 UTC m=+0.055640345 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 22 04:12:16 np0005531887 podman[258615]: 2025-11-22 09:12:16.824872309 +0000 UTC m=+0.045972127 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:12:18 np0005531887 nova_compute[186849]: 2025-11-22 09:12:18.255 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:19 np0005531887 nova_compute[186849]: 2025-11-22 09:12:19.397 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:23 np0005531887 nova_compute[186849]: 2025-11-22 09:12:23.259 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:23 np0005531887 nova_compute[186849]: 2025-11-22 09:12:23.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:24 np0005531887 nova_compute[186849]: 2025-11-22 09:12:24.398 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:24 np0005531887 podman[258639]: 2025-11-22 09:12:24.833710165 +0000 UTC m=+0.053971723 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=)
Nov 22 04:12:28 np0005531887 nova_compute[186849]: 2025-11-22 09:12:28.264 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:28 np0005531887 podman[258661]: 2025-11-22 09:12:28.845233667 +0000 UTC m=+0.060659108 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 04:12:28 np0005531887 podman[258662]: 2025-11-22 09:12:28.867233726 +0000 UTC m=+0.079376076 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 22 04:12:29 np0005531887 nova_compute[186849]: 2025-11-22 09:12:29.401 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:32 np0005531887 podman[258705]: 2025-11-22 09:12:32.84455785 +0000 UTC m=+0.056461695 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:12:33 np0005531887 nova_compute[186849]: 2025-11-22 09:12:33.269 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:34 np0005531887 nova_compute[186849]: 2025-11-22 09:12:34.403 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:12:36.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:12:37.415 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:12:37.415 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:12:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:12:37.415 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:12:38 np0005531887 nova_compute[186849]: 2025-11-22 09:12:38.277 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:39 np0005531887 nova_compute[186849]: 2025-11-22 09:12:39.405 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:41 np0005531887 podman[258731]: 2025-11-22 09:12:41.83107694 +0000 UTC m=+0.056401613 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 04:12:43 np0005531887 nova_compute[186849]: 2025-11-22 09:12:43.280 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:44 np0005531887 nova_compute[186849]: 2025-11-22 09:12:44.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:44 np0005531887 nova_compute[186849]: 2025-11-22 09:12:44.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:45 np0005531887 podman[258750]: 2025-11-22 09:12:45.891783188 +0000 UTC m=+0.105700740 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 04:12:47 np0005531887 podman[258770]: 2025-11-22 09:12:47.8439216 +0000 UTC m=+0.065057965 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:12:48 np0005531887 nova_compute[186849]: 2025-11-22 09:12:48.283 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:49 np0005531887 nova_compute[186849]: 2025-11-22 09:12:49.409 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:53 np0005531887 nova_compute[186849]: 2025-11-22 09:12:53.288 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:54 np0005531887 nova_compute[186849]: 2025-11-22 09:12:54.437 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:55 np0005531887 podman[258795]: 2025-11-22 09:12:55.845701973 +0000 UTC m=+0.062836221 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64)
Nov 22 04:12:58 np0005531887 nova_compute[186849]: 2025-11-22 09:12:58.292 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:59 np0005531887 nova_compute[186849]: 2025-11-22 09:12:59.439 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:12:59 np0005531887 nova_compute[186849]: 2025-11-22 09:12:59.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:12:59 np0005531887 nova_compute[186849]: 2025-11-22 09:12:59.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:12:59 np0005531887 podman[258816]: 2025-11-22 09:12:59.848454769 +0000 UTC m=+0.066545361 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 04:12:59 np0005531887 podman[258817]: 2025-11-22 09:12:59.886062461 +0000 UTC m=+0.102897812 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.802 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.803 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.803 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.938 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.939 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5720MB free_disk=73.26412582397461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.939 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.939 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.999 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:13:00 np0005531887 nova_compute[186849]: 2025-11-22 09:13:00.999 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:13:01 np0005531887 nova_compute[186849]: 2025-11-22 09:13:01.024 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:13:01 np0005531887 nova_compute[186849]: 2025-11-22 09:13:01.036 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:13:01 np0005531887 nova_compute[186849]: 2025-11-22 09:13:01.037 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:13:01 np0005531887 nova_compute[186849]: 2025-11-22 09:13:01.038 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:03 np0005531887 nova_compute[186849]: 2025-11-22 09:13:03.024 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:03 np0005531887 nova_compute[186849]: 2025-11-22 09:13:03.024 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:03 np0005531887 nova_compute[186849]: 2025-11-22 09:13:03.025 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:03 np0005531887 nova_compute[186849]: 2025-11-22 09:13:03.296 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:03 np0005531887 podman[258864]: 2025-11-22 09:13:03.838342541 +0000 UTC m=+0.057340076 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:13:04 np0005531887 nova_compute[186849]: 2025-11-22 09:13:04.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:06 np0005531887 nova_compute[186849]: 2025-11-22 09:13:06.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:08 np0005531887 nova_compute[186849]: 2025-11-22 09:13:08.301 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:09 np0005531887 nova_compute[186849]: 2025-11-22 09:13:09.444 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:12 np0005531887 podman[258891]: 2025-11-22 09:13:12.855481161 +0000 UTC m=+0.080760169 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 04:13:13 np0005531887 nova_compute[186849]: 2025-11-22 09:13:13.306 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:14 np0005531887 nova_compute[186849]: 2025-11-22 09:13:14.449 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:16 np0005531887 podman[258910]: 2025-11-22 09:13:16.828918221 +0000 UTC m=+0.051141784 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 04:13:18 np0005531887 nova_compute[186849]: 2025-11-22 09:13:18.338 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:18 np0005531887 podman[258929]: 2025-11-22 09:13:18.823710518 +0000 UTC m=+0.050113509 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:13:19 np0005531887 nova_compute[186849]: 2025-11-22 09:13:19.450 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:23 np0005531887 nova_compute[186849]: 2025-11-22 09:13:23.343 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:24 np0005531887 nova_compute[186849]: 2025-11-22 09:13:24.451 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:25 np0005531887 nova_compute[186849]: 2025-11-22 09:13:25.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:26 np0005531887 podman[258953]: 2025-11-22 09:13:26.84713912 +0000 UTC m=+0.062200134 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 04:13:28 np0005531887 nova_compute[186849]: 2025-11-22 09:13:28.346 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:29 np0005531887 nova_compute[186849]: 2025-11-22 09:13:29.460 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:30 np0005531887 podman[258974]: 2025-11-22 09:13:30.840023664 +0000 UTC m=+0.057040178 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:13:30 np0005531887 podman[258975]: 2025-11-22 09:13:30.860016934 +0000 UTC m=+0.074859315 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 04:13:33 np0005531887 nova_compute[186849]: 2025-11-22 09:13:33.349 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:34 np0005531887 nova_compute[186849]: 2025-11-22 09:13:34.461 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:34 np0005531887 podman[259020]: 2025-11-22 09:13:34.850533071 +0000 UTC m=+0.066942341 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:13:37.416 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:13:37.417 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:13:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:13:37.417 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:13:38 np0005531887 nova_compute[186849]: 2025-11-22 09:13:38.352 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:39 np0005531887 nova_compute[186849]: 2025-11-22 09:13:39.464 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:40 np0005531887 nova_compute[186849]: 2025-11-22 09:13:40.764 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:43 np0005531887 nova_compute[186849]: 2025-11-22 09:13:43.356 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:43 np0005531887 podman[259045]: 2025-11-22 09:13:43.856662262 +0000 UTC m=+0.080920444 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 04:13:44 np0005531887 nova_compute[186849]: 2025-11-22 09:13:44.467 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:46 np0005531887 nova_compute[186849]: 2025-11-22 09:13:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:13:47 np0005531887 podman[259064]: 2025-11-22 09:13:47.842684729 +0000 UTC m=+0.060806962 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:13:48 np0005531887 nova_compute[186849]: 2025-11-22 09:13:48.361 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:49 np0005531887 nova_compute[186849]: 2025-11-22 09:13:49.471 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:49 np0005531887 podman[259086]: 2025-11-22 09:13:49.834861731 +0000 UTC m=+0.055943511 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 04:13:53 np0005531887 nova_compute[186849]: 2025-11-22 09:13:53.364 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:54 np0005531887 nova_compute[186849]: 2025-11-22 09:13:54.473 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:57 np0005531887 podman[259109]: 2025-11-22 09:13:57.840088449 +0000 UTC m=+0.061781925 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=)
Nov 22 04:13:58 np0005531887 nova_compute[186849]: 2025-11-22 09:13:58.369 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:13:59 np0005531887 nova_compute[186849]: 2025-11-22 09:13:59.475 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:00 np0005531887 nova_compute[186849]: 2025-11-22 09:14:00.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:00 np0005531887 nova_compute[186849]: 2025-11-22 09:14:00.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:14:01 np0005531887 nova_compute[186849]: 2025-11-22 09:14:01.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:01 np0005531887 nova_compute[186849]: 2025-11-22 09:14:01.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:14:01 np0005531887 nova_compute[186849]: 2025-11-22 09:14:01.770 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:14:01 np0005531887 nova_compute[186849]: 2025-11-22 09:14:01.790 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:14:01 np0005531887 podman[259128]: 2025-11-22 09:14:01.847374677 +0000 UTC m=+0.066136652 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 04:14:01 np0005531887 podman[259129]: 2025-11-22 09:14:01.875570197 +0000 UTC m=+0.089085084 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.797 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.798 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.956 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.957 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5726MB free_disk=73.26410675048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.958 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:02 np0005531887 nova_compute[186849]: 2025-11-22 09:14:02.958 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.029 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.030 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.063 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.083 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.084 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.085 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:03 np0005531887 nova_compute[186849]: 2025-11-22 09:14:03.372 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:04 np0005531887 nova_compute[186849]: 2025-11-22 09:14:04.479 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:05 np0005531887 nova_compute[186849]: 2025-11-22 09:14:05.085 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:05 np0005531887 nova_compute[186849]: 2025-11-22 09:14:05.086 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:05 np0005531887 podman[259170]: 2025-11-22 09:14:05.861464162 +0000 UTC m=+0.068759986 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:14:07 np0005531887 nova_compute[186849]: 2025-11-22 09:14:07.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:08 np0005531887 nova_compute[186849]: 2025-11-22 09:14:08.378 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:09 np0005531887 nova_compute[186849]: 2025-11-22 09:14:09.481 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:13 np0005531887 nova_compute[186849]: 2025-11-22 09:14:13.382 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:14 np0005531887 nova_compute[186849]: 2025-11-22 09:14:14.482 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:14 np0005531887 podman[259194]: 2025-11-22 09:14:14.864411755 +0000 UTC m=+0.080326899 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent)
Nov 22 04:14:18 np0005531887 nova_compute[186849]: 2025-11-22 09:14:18.386 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:18 np0005531887 podman[259213]: 2025-11-22 09:14:18.851329073 +0000 UTC m=+0.064906071 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 04:14:19 np0005531887 nova_compute[186849]: 2025-11-22 09:14:19.484 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:20 np0005531887 podman[259233]: 2025-11-22 09:14:20.83752323 +0000 UTC m=+0.053045101 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:14:23 np0005531887 nova_compute[186849]: 2025-11-22 09:14:23.390 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:24 np0005531887 nova_compute[186849]: 2025-11-22 09:14:24.487 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:27 np0005531887 nova_compute[186849]: 2025-11-22 09:14:27.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:28 np0005531887 nova_compute[186849]: 2025-11-22 09:14:28.393 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:28 np0005531887 podman[259257]: 2025-11-22 09:14:28.845174606 +0000 UTC m=+0.067453844 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 22 04:14:29 np0005531887 nova_compute[186849]: 2025-11-22 09:14:29.488 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:32 np0005531887 podman[259281]: 2025-11-22 09:14:32.879450876 +0000 UTC m=+0.089636999 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:14:32 np0005531887 podman[259282]: 2025-11-22 09:14:32.91106643 +0000 UTC m=+0.110847787 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:14:33 np0005531887 nova_compute[186849]: 2025-11-22 09:14:33.396 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:34 np0005531887 nova_compute[186849]: 2025-11-22 09:14:34.490 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:14:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:14:36 np0005531887 podman[259327]: 2025-11-22 09:14:36.839053055 +0000 UTC m=+0.060205136 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:14:37.418 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:14:37.418 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:14:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:14:37.418 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:14:38 np0005531887 nova_compute[186849]: 2025-11-22 09:14:38.400 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:39 np0005531887 nova_compute[186849]: 2025-11-22 09:14:39.492 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:43 np0005531887 nova_compute[186849]: 2025-11-22 09:14:43.404 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:44 np0005531887 nova_compute[186849]: 2025-11-22 09:14:44.493 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:45 np0005531887 podman[259352]: 2025-11-22 09:14:45.855502939 +0000 UTC m=+0.064399249 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:14:46 np0005531887 nova_compute[186849]: 2025-11-22 09:14:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:14:48 np0005531887 nova_compute[186849]: 2025-11-22 09:14:48.407 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:49 np0005531887 nova_compute[186849]: 2025-11-22 09:14:49.495 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:49 np0005531887 podman[259371]: 2025-11-22 09:14:49.873763426 +0000 UTC m=+0.080799941 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 04:14:51 np0005531887 podman[259391]: 2025-11-22 09:14:51.826129403 +0000 UTC m=+0.048084419 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:14:53 np0005531887 nova_compute[186849]: 2025-11-22 09:14:53.410 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:54 np0005531887 nova_compute[186849]: 2025-11-22 09:14:54.497 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:58 np0005531887 nova_compute[186849]: 2025-11-22 09:14:58.413 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:59 np0005531887 nova_compute[186849]: 2025-11-22 09:14:59.500 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:14:59 np0005531887 podman[259415]: 2025-11-22 09:14:59.836501667 +0000 UTC m=+0.056428414 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.768 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.788 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.789 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.935 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.937 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5735MB free_disk=73.27063369750977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.937 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.937 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.996 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:15:02 np0005531887 nova_compute[186849]: 2025-11-22 09:15:02.997 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:15:03 np0005531887 nova_compute[186849]: 2025-11-22 09:15:03.114 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:15:03 np0005531887 nova_compute[186849]: 2025-11-22 09:15:03.147 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:15:03 np0005531887 nova_compute[186849]: 2025-11-22 09:15:03.149 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:15:03 np0005531887 nova_compute[186849]: 2025-11-22 09:15:03.149 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:03 np0005531887 nova_compute[186849]: 2025-11-22 09:15:03.418 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:03 np0005531887 podman[259437]: 2025-11-22 09:15:03.83336557 +0000 UTC m=+0.056249059 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:15:03 np0005531887 podman[259438]: 2025-11-22 09:15:03.88066774 +0000 UTC m=+0.100753540 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.150 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.151 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.151 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.168 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.502 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:04 np0005531887 nova_compute[186849]: 2025-11-22 09:15:04.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:06 np0005531887 nova_compute[186849]: 2025-11-22 09:15:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:07 np0005531887 podman[259482]: 2025-11-22 09:15:07.83743487 +0000 UTC m=+0.059500730 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:15:08 np0005531887 nova_compute[186849]: 2025-11-22 09:15:08.420 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:08 np0005531887 nova_compute[186849]: 2025-11-22 09:15:08.761 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:09 np0005531887 nova_compute[186849]: 2025-11-22 09:15:09.504 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:13 np0005531887 nova_compute[186849]: 2025-11-22 09:15:13.425 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:14 np0005531887 nova_compute[186849]: 2025-11-22 09:15:14.509 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:16 np0005531887 podman[259507]: 2025-11-22 09:15:16.855089892 +0000 UTC m=+0.056018213 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 04:15:18 np0005531887 nova_compute[186849]: 2025-11-22 09:15:18.429 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:19 np0005531887 nova_compute[186849]: 2025-11-22 09:15:19.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:20 np0005531887 podman[259526]: 2025-11-22 09:15:20.83298269 +0000 UTC m=+0.059151872 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 04:15:22 np0005531887 podman[259546]: 2025-11-22 09:15:22.821958024 +0000 UTC m=+0.044472451 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:15:23 np0005531887 nova_compute[186849]: 2025-11-22 09:15:23.434 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:24 np0005531887 nova_compute[186849]: 2025-11-22 09:15:24.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:28 np0005531887 nova_compute[186849]: 2025-11-22 09:15:28.438 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:28 np0005531887 nova_compute[186849]: 2025-11-22 09:15:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:29 np0005531887 nova_compute[186849]: 2025-11-22 09:15:29.516 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:30 np0005531887 podman[259570]: 2025-11-22 09:15:30.87861992 +0000 UTC m=+0.086541141 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 04:15:33 np0005531887 nova_compute[186849]: 2025-11-22 09:15:33.442 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:34 np0005531887 nova_compute[186849]: 2025-11-22 09:15:34.517 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:34 np0005531887 podman[259593]: 2025-11-22 09:15:34.8313071 +0000 UTC m=+0.055798587 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 04:15:34 np0005531887 podman[259594]: 2025-11-22 09:15:34.865398736 +0000 UTC m=+0.086485390 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 04:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:15:37.419 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:15:37.420 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:15:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:15:37.420 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:15:38 np0005531887 nova_compute[186849]: 2025-11-22 09:15:38.445 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:38 np0005531887 podman[259638]: 2025-11-22 09:15:38.855316999 +0000 UTC m=+0.072208740 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 04:15:39 np0005531887 nova_compute[186849]: 2025-11-22 09:15:39.519 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:41 np0005531887 nova_compute[186849]: 2025-11-22 09:15:41.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:43 np0005531887 nova_compute[186849]: 2025-11-22 09:15:43.448 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:44 np0005531887 nova_compute[186849]: 2025-11-22 09:15:44.521 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:46 np0005531887 nova_compute[186849]: 2025-11-22 09:15:46.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:47 np0005531887 podman[259662]: 2025-11-22 09:15:47.849341614 +0000 UTC m=+0.073868171 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:15:48 np0005531887 nova_compute[186849]: 2025-11-22 09:15:48.451 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:49 np0005531887 nova_compute[186849]: 2025-11-22 09:15:49.523 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:51 np0005531887 podman[259681]: 2025-11-22 09:15:51.832359578 +0000 UTC m=+0.058550026 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:15:53 np0005531887 nova_compute[186849]: 2025-11-22 09:15:53.455 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:53 np0005531887 podman[259701]: 2025-11-22 09:15:53.859861716 +0000 UTC m=+0.072317473 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 04:15:54 np0005531887 nova_compute[186849]: 2025-11-22 09:15:54.524 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:54 np0005531887 nova_compute[186849]: 2025-11-22 09:15:54.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:54 np0005531887 nova_compute[186849]: 2025-11-22 09:15:54.769 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 22 04:15:56 np0005531887 nova_compute[186849]: 2025-11-22 09:15:56.789 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:15:56 np0005531887 nova_compute[186849]: 2025-11-22 09:15:56.790 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 22 04:15:56 np0005531887 nova_compute[186849]: 2025-11-22 09:15:56.811 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 22 04:15:58 np0005531887 nova_compute[186849]: 2025-11-22 09:15:58.510 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:15:59 np0005531887 nova_compute[186849]: 2025-11-22 09:15:59.526 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:00 np0005531887 nova_compute[186849]: 2025-11-22 09:16:00.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:01 np0005531887 podman[259726]: 2025-11-22 09:16:01.827041611 +0000 UTC m=+0.052738343 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.782 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.782 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.783 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.839 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.839 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.840 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.840 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.984 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.985 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5734MB free_disk=73.2628402709961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:02 np0005531887 nova_compute[186849]: 2025-11-22 09:16:02.986 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.052 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.053 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.084 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.099 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.100 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.100 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:03 np0005531887 nova_compute[186849]: 2025-11-22 09:16:03.514 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:04 np0005531887 nova_compute[186849]: 2025-11-22 09:16:04.086 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:04 np0005531887 nova_compute[186849]: 2025-11-22 09:16:04.087 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:16:04 np0005531887 nova_compute[186849]: 2025-11-22 09:16:04.087 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:16:04 np0005531887 nova_compute[186849]: 2025-11-22 09:16:04.106 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:16:04 np0005531887 nova_compute[186849]: 2025-11-22 09:16:04.528 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:05 np0005531887 podman[259746]: 2025-11-22 09:16:05.841508464 +0000 UTC m=+0.065616169 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 04:16:05 np0005531887 podman[259747]: 2025-11-22 09:16:05.863988205 +0000 UTC m=+0.078676399 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 04:16:06 np0005531887 nova_compute[186849]: 2025-11-22 09:16:06.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:06 np0005531887 nova_compute[186849]: 2025-11-22 09:16:06.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:07 np0005531887 nova_compute[186849]: 2025-11-22 09:16:07.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:08 np0005531887 nova_compute[186849]: 2025-11-22 09:16:08.518 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:09 np0005531887 nova_compute[186849]: 2025-11-22 09:16:09.530 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:09 np0005531887 nova_compute[186849]: 2025-11-22 09:16:09.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:09 np0005531887 podman[259789]: 2025-11-22 09:16:09.831198302 +0000 UTC m=+0.052561170 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:16:13 np0005531887 nova_compute[186849]: 2025-11-22 09:16:13.522 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:14 np0005531887 nova_compute[186849]: 2025-11-22 09:16:14.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:18 np0005531887 nova_compute[186849]: 2025-11-22 09:16:18.561 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:18 np0005531887 podman[259811]: 2025-11-22 09:16:18.834596055 +0000 UTC m=+0.056241188 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 04:16:19 np0005531887 nova_compute[186849]: 2025-11-22 09:16:19.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:22 np0005531887 podman[259830]: 2025-11-22 09:16:22.863556114 +0000 UTC m=+0.077274505 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 04:16:23 np0005531887 nova_compute[186849]: 2025-11-22 09:16:23.564 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:24 np0005531887 nova_compute[186849]: 2025-11-22 09:16:24.533 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:24 np0005531887 podman[259851]: 2025-11-22 09:16:24.574312492 +0000 UTC m=+0.046297827 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:16:28 np0005531887 nova_compute[186849]: 2025-11-22 09:16:28.603 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:28 np0005531887 nova_compute[186849]: 2025-11-22 09:16:28.769 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:29 np0005531887 nova_compute[186849]: 2025-11-22 09:16:29.535 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:32 np0005531887 podman[259877]: 2025-11-22 09:16:32.853180042 +0000 UTC m=+0.074567548 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 04:16:33 np0005531887 nova_compute[186849]: 2025-11-22 09:16:33.606 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:34 np0005531887 nova_compute[186849]: 2025-11-22 09:16:34.536 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 ceilometer_agent_compute[197578]: 2025-11-22 09:16:36.686 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 04:16:36 np0005531887 podman[259899]: 2025-11-22 09:16:36.835440067 +0000 UTC m=+0.055736187 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 04:16:36 np0005531887 podman[259900]: 2025-11-22 09:16:36.892238778 +0000 UTC m=+0.107692939 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 04:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:16:37.421 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:16:37.421 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:16:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:16:37.421 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:16:38 np0005531887 nova_compute[186849]: 2025-11-22 09:16:38.609 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:39 np0005531887 nova_compute[186849]: 2025-11-22 09:16:39.539 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:40 np0005531887 podman[259943]: 2025-11-22 09:16:40.840123671 +0000 UTC m=+0.064963283 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 04:16:43 np0005531887 nova_compute[186849]: 2025-11-22 09:16:43.612 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:44 np0005531887 nova_compute[186849]: 2025-11-22 09:16:44.542 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:47 np0005531887 nova_compute[186849]: 2025-11-22 09:16:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:16:48 np0005531887 nova_compute[186849]: 2025-11-22 09:16:48.618 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:49 np0005531887 nova_compute[186849]: 2025-11-22 09:16:49.543 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:49 np0005531887 podman[259967]: 2025-11-22 09:16:49.828169 +0000 UTC m=+0.053535653 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 04:16:53 np0005531887 nova_compute[186849]: 2025-11-22 09:16:53.651 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:53 np0005531887 podman[259986]: 2025-11-22 09:16:53.836342059 +0000 UTC m=+0.059608412 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 04:16:54 np0005531887 nova_compute[186849]: 2025-11-22 09:16:54.544 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:54 np0005531887 podman[260006]: 2025-11-22 09:16:54.837449708 +0000 UTC m=+0.064239805 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:16:58 np0005531887 nova_compute[186849]: 2025-11-22 09:16:58.656 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:16:59 np0005531887 nova_compute[186849]: 2025-11-22 09:16:59.546 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.798 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.799 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.799 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.991 186853 WARNING nova.virt.libvirt.driver [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.992 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5721MB free_disk=73.2628402709961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:02 np0005531887 nova_compute[186849]: 2025-11-22 09:17:02.993 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.064 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.064 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.183 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing inventories for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.251 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating ProviderTree inventory for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.252 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Updating inventory in ProviderTree for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.271 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing aggregate associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.297 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Refreshing trait associations for resource provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78, traits: COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.323 186853 DEBUG nova.compute.provider_tree [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed in ProviderTree for provider: 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.344 186853 DEBUG nova.scheduler.client.report [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Inventory has not changed for provider 9393c4bb-7ceb-4b69-ba30-ecbfb8fa5b78 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.347 186853 DEBUG nova.compute.resource_tracker [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.347 186853 DEBUG oslo_concurrency.lockutils [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:03 np0005531887 nova_compute[186849]: 2025-11-22 09:17:03.705 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:03 np0005531887 podman[260031]: 2025-11-22 09:17:03.877891951 +0000 UTC m=+0.095353198 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.235 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.236 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.236 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.255 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.256 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.273 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.273 186853 DEBUG nova.compute.manager [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 22 04:17:04 np0005531887 nova_compute[186849]: 2025-11-22 09:17:04.547 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:07 np0005531887 nova_compute[186849]: 2025-11-22 09:17:07.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:07 np0005531887 nova_compute[186849]: 2025-11-22 09:17:07.785 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:07 np0005531887 nova_compute[186849]: 2025-11-22 09:17:07.786 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:07 np0005531887 podman[260052]: 2025-11-22 09:17:07.82650304 +0000 UTC m=+0.050684102 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:17:07 np0005531887 podman[260053]: 2025-11-22 09:17:07.852954969 +0000 UTC m=+0.073285348 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 04:17:08 np0005531887 nova_compute[186849]: 2025-11-22 09:17:08.707 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:09 np0005531887 nova_compute[186849]: 2025-11-22 09:17:09.550 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:10 np0005531887 nova_compute[186849]: 2025-11-22 09:17:10.763 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:11 np0005531887 podman[260100]: 2025-11-22 09:17:11.86123046 +0000 UTC m=+0.082206515 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 04:17:13 np0005531887 nova_compute[186849]: 2025-11-22 09:17:13.795 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:14 np0005531887 nova_compute[186849]: 2025-11-22 09:17:14.551 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:18 np0005531887 nova_compute[186849]: 2025-11-22 09:17:18.798 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:19 np0005531887 nova_compute[186849]: 2025-11-22 09:17:19.558 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:20 np0005531887 podman[260124]: 2025-11-22 09:17:20.832325654 +0000 UTC m=+0.047745471 container health_status cd73be95f1b36f29426755d2cbabc5fbf0dc759e0278c1e841f599737159d9da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:17:23 np0005531887 nova_compute[186849]: 2025-11-22 09:17:23.803 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:24 np0005531887 nova_compute[186849]: 2025-11-22 09:17:24.560 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:24 np0005531887 podman[260143]: 2025-11-22 09:17:24.870738214 +0000 UTC m=+0.080715199 container health_status 30a952e857b1c4f66fb633672f7fb23a10a2539aa17840663eb14e05d63830a3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 04:17:24 np0005531887 podman[260163]: 2025-11-22 09:17:24.929897474 +0000 UTC m=+0.055109991 container health_status 3be178ecb3c6af9d186f76decc7b505f5b8d1d2d58b2167348b202d44a8af85f (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 04:17:28 np0005531887 nova_compute[186849]: 2025-11-22 09:17:28.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:28 np0005531887 nova_compute[186849]: 2025-11-22 09:17:28.807 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:29 np0005531887 nova_compute[186849]: 2025-11-22 09:17:29.562 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:33 np0005531887 nova_compute[186849]: 2025-11-22 09:17:33.812 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:34 np0005531887 nova_compute[186849]: 2025-11-22 09:17:34.565 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:34 np0005531887 podman[260189]: 2025-11-22 09:17:34.839739349 +0000 UTC m=+0.065553938 container health_status ae22ac08c48c97951e9b40aa52a3daaa7a6f200f6283369920dee536600f205e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6)
Nov 22 04:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:17:37.422 104084 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 22 04:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:17:37.423 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 22 04:17:37 np0005531887 ovn_metadata_agent[104079]: 2025-11-22 09:17:37.424 104084 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 22 04:17:38 np0005531887 nova_compute[186849]: 2025-11-22 09:17:38.843 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:38 np0005531887 podman[260210]: 2025-11-22 09:17:38.864777742 +0000 UTC m=+0.077801708 container health_status 083dcb26d3e92b081811e06370f906364bc419c8774595b4d423fa33c8aba5dd (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 04:17:38 np0005531887 podman[260211]: 2025-11-22 09:17:38.898640911 +0000 UTC m=+0.107103656 container health_status 5d483a29ad21f1039a464dced67464760b8fe607413069a858f5da9e63fcb08d (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 04:17:39 np0005531887 systemd-logind[821]: New session 62 of user zuul.
Nov 22 04:17:39 np0005531887 systemd[1]: Started Session 62 of User zuul.
Nov 22 04:17:39 np0005531887 nova_compute[186849]: 2025-11-22 09:17:39.566 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:42 np0005531887 podman[260393]: 2025-11-22 09:17:42.148240884 +0000 UTC m=+0.065082546 container health_status 899e5543a46221e74be81d55d2573d6eadf0c5b86934e7498f42b985f6ddada2 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 04:17:43 np0005531887 nova_compute[186849]: 2025-11-22 09:17:43.762 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:43 np0005531887 nova_compute[186849]: 2025-11-22 09:17:43.848 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:44 np0005531887 ovs-vsctl[260448]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 04:17:44 np0005531887 nova_compute[186849]: 2025-11-22 09:17:44.567 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:45 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 04:17:45 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 04:17:45 np0005531887 virtqemud[186424]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 04:17:47 np0005531887 nova_compute[186849]: 2025-11-22 09:17:47.768 186853 DEBUG oslo_service.periodic_task [None req-08d5008f-127d-490c-9c75-3d1b8e67ebd0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 22 04:17:48 np0005531887 systemd[1]: Starting Hostname Service...
Nov 22 04:17:48 np0005531887 systemd[1]: Started Hostname Service.
Nov 22 04:17:48 np0005531887 nova_compute[186849]: 2025-11-22 09:17:48.852 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 22 04:17:49 np0005531887 nova_compute[186849]: 2025-11-22 09:17:49.568 186853 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
